Table of contents
  1. Logging
    1. Querying
      1. By Time
    2. CLI
      1. READ LOGS
        1. Quick Scripts
    3. STORAGE
      1. Routing
      2. Data Buckets
      3. Trigger Functions
    4. API
      1. Entries List
    5. Steps I took for Logging
      1. __1) Installed CLI via PowerShell__
      2. __2) Set your default project so you don’t have to supply the --project flag with each command__
      3. __3) FYI__
      4. __4) In the CLI, get logs__
      5. __5) Can test API calls and parameters with API Explorer__
        1. Get Logging through API
      6. __6) AUTHENTICATED SERVICE ACCOUNT__
    6. Supported destinations




Logging

Logs Explorer

Logging Class Docs for List

Query Language Doc

C# GitHub Google.Cloud.Logging.V2

JSON return LogEntry


Querying

By Time

In the interface, you can set specific limits on the date and time of log entries to show. For example, if you add the following conditions to your query, the preview displays exactly the log entries in the indicated 30-minute period and you won’t be able to scroll outside of that date range:

    timestamp >= "2016-11-29T23:00:00Z"
    timestamp <= "2016-11-29T23:30:00Z"

When writing a query with a timestamp, you must use dates and times in the format shown above.

You can also search log entries using timestamp shortcuts. For example, you can enter a date with a comparison operator to get all log entries after a
certain day:

    timestamp > "2016-11-29"
Note: Logging interprets query expressions that use the YYYY-MM-DD format as YYYY-MM-DDT00:00:00Z.
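
The same timestamp filters work outside the Logs Explorer as well; a minimal sketch using the CLI read command covered below (the project ID is a placeholder):

    # Read entries from the same 30-minute window via the CLI; the filter string is identical to the console query
    gcloud logging read 'timestamp>="2016-11-29T23:00:00Z" AND timestamp<="2016-11-29T23:30:00Z"' --project=your-project-name --limit=10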


CLI

READ LOGS

Quick Scripts

gcloud logging read "resource.type=global AND jsonPayload.queryResult.responseMessages.conversationSuccess.metadata.resolved:*" --freshness="1d" --resource-names="projects/vcu-virtual-assistant-bot" --limit=1    
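
If you want to see every field of the matching entries, the same read can dump the full LogEntry objects as JSON (a sketch; same filter and resource names as above):

    # Same query as above, but emit the complete LogEntry objects as JSON
    gcloud logging read "resource.type=global AND jsonPayload.queryResult.responseMessages.conversationSuccess.metadata.resolved:*" --freshness="1d" --resource-names="projects/vcu-virtual-assistant-bot" --limit=5 --format=json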

STORAGE

Routing

Data Buckets

Trigger Functions


API

Entries List


Steps I took for Logging

__1) Installed CLI via PowerShell__

   (New-Object Net.WebClient).DownloadFile("https://dl.google.com/dl/cloudsdk/channels/rapid/GoogleCloudSDKInstaller.exe", "$env:Temp\GoogleCloudSDKInstaller.exe")

   & $env:Temp\GoogleCloudSDKInstaller.exe
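
After the installer finishes, you can sanity-check the install and run the initial setup:

    # Confirm the SDK is on PATH, then run the interactive setup (login + default project)
    gcloud --version
    gcloud init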

__2) Set your default project so you don’t have to supply the --project flag with each command:__

gcloud config set project your-project-name
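
To confirm the default project took effect:

    # Print the currently configured default project
    gcloud config get-value project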

__3) FYI__

Cloud SDK requires Python; supported versions are Python 3 (3.5 to 3.9). By default, the Windows version of Cloud SDK comes bundled with Python 3. To use Cloud SDK, your operating system must be able to run a supported version of Python.    
    
The installer installs all necessary dependencies, including the needed Python version. While Cloud SDK installs and manages Python 3 by default, you can use an existing Python installation if necessary by unchecking the option to Install Bundled Python. See gcloud topic startup to learn how to use an existing Python installation.    

__4) In the CLI, get logs__

gcloud logging read --freshness="50d"

--freshness=FRESHNESS; default="1d"

Return entries that are not older than this value. Works only with DESC ordering and filters without a timestamp. See $ gcloud topic datetimes for
information on duration formats.
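
A sketch combining --freshness with a timestamp-free filter (placeholder project; --order=desc is already the default):

    # Last 7 days of global-resource entries, newest first
    gcloud logging read "resource.type=global" --freshness="7d" --order=desc --limit=20 --project=your-project-name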

__5) Can test API calls and parameters with API Explorer__

Use this request body:

    {
      "resourceNames": [
        "projects/your-project-name"
      ],
      "filter": "resource.type=global",
      "orderBy": "timestamp desc"
    }
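
The same entries.list call can also be made against the REST endpoint directly, using gcloud to mint an access token (bash-style sketch; the project name is a placeholder):

    # POST the request body above to the Logging API entries.list method
    curl -X POST "https://logging.googleapis.com/v2/entries:list" \
      -H "Authorization: Bearer $(gcloud auth print-access-token)" \
      -H "Content-Type: application/json" \
      -d '{"resourceNames":["projects/your-project-name"],"filter":"resource.type=global","orderBy":"timestamp desc"}'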

Get Logging through API

  • See if the Logging API is enabled:
gcloud services list --project=your-project-name    
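
To check for the Logging service specifically (sketch; the filter key assumes the services list output exposes config.name):

    # Show only the Cloud Logging service entry, if it is enabled
    gcloud services list --enabled --project=your-project-name --filter="config.name=logging.googleapis.com"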

__6) AUTHENTICATED SERVICE ACCOUNT__

   gcloud auth login --cred-file=C:\Users\bp01232023\source\repos\Wiki\GoogleDialogFlow.wiki\FilesOfInterest\your-project-name-e9d441de07b0.json    

or

 gcloud auth activate-service-account google-cli-service-account@your-project-name.iam.gserviceaccount.com --key-file=C:\Users\bp01232023\source\repos\Wiki\GoogleDialogFlow.wiki\FilesOfInterest\your-project-name-e9d441de07b0.json    
  • Enable the Logging API:
gcloud services enable logging.googleapis.com --project=your-project-name
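
To double-check which account is active and grab a token for raw REST calls:

    # List credentialed accounts, then print a short-lived OAuth 2.0 access token for the active one
    gcloud auth list
    gcloud auth print-access-token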

Many Google Cloud Platform events are logged in Cloud Audit Logs. You can filter these logs and forward them to Pub/Sub topics using sinks. These Pub/Sub topics can then send notifications that trigger Cloud Functions. This allows you to create custom events from any Google Cloud Platform service that produces audit logs.

Route logs to new destinations with a Log Router and Sink

https://cloud.google.com/logging/docs/routing/overview#destinations

Sinks control how Cloud Logging routes logs. Using sinks, you can route some or all of your logs to any of the supported destinations listed below.
Example:

https://cloud.google.com/functions/docs/samples/functions-log-stackdriver?hl=en#functions_log_stackdriver-java

Supported destinations

You can use the Log Router to route certain logs to supported destinations in any Cloud project. Logging supports the following sink destinations:

Cloud Logging log buckets: Provides storage in Cloud Logging. A log bucket can store logs ingested by multiple Google Cloud projects. You specify the data retention period, the data storage location, and the log views on a log bucket. Log views let you control which logs in a log bucket a user can access. Log buckets are recommended storage when you want to troubleshoot your applications and services, or to analyze your log data. If you need to combine your Cloud Logging data with other data sources, then you can store your logs in log buckets that are upgraded to use Log Analytics, and then link that bucket to BigQuery. For information about viewing logs, see Query and view logs overview and View logs routed to Cloud Logging buckets.

Pub/Sub topics: Provides support for third-party integrations, such as Splunk, with Logging. Log entries are formatted into JSON and then delivered to a Pub/Sub topic. For information about viewing these logs, their organization, and how to configure a third-party integration, see View logs routed to Pub/Sub.

BigQuery datasets: Provides storage of log entries in BigQuery datasets. You can use big data analysis capabilities on the stored logs. If you need to combine your Cloud Logging data with other data sources, then you can route your logs to BigQuery. An alternative is to store your logs in log buckets that are upgraded to use Log Analytics and then linked to BigQuery. For information about viewing logs routed to BigQuery, see View logs routed to BigQuery.

Cloud Storage buckets: Provides inexpensive, long-term storage of log data in Cloud Storage. Log entries are stored as JSON files. For information about viewing these logs, how they are organized, and how late-arriving logs are handled, see View logs routed to Cloud Storage.
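
For reference, the sink destination argument differs by type; a sketch of the four forms (bucket, topic, dataset, and project names are placeholders):

    # Cloud Logging log bucket
    gcloud logging sinks create bucket-sink logging.googleapis.com/projects/your-project-name/locations/global/buckets/my-log-bucket --log-filter='severity>=WARNING'

    # Pub/Sub topic
    gcloud logging sinks create topic-sink pubsub.googleapis.com/projects/your-project-name/topics/my-topic --log-filter='resource.type=global'

    # BigQuery dataset
    gcloud logging sinks create bq-sink bigquery.googleapis.com/projects/your-project-name/datasets/my_dataset --log-filter='resource.type=global'

    # Cloud Storage bucket
    gcloud logging sinks create gcs-sink storage.googleapis.com/my-archive-bucket --log-filter='resource.type=global'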