schemas-proto v2.1.5-pre

Google Protocol Buffers schemas, including gRPC definitions

Code Generation

Generation of the protobuf and gRPC definitions is performed in a Docker container. To generate the source (and use the pre-commit hook), make sure that you have Docker installed and access to AWS ECR to pull down the container.

To set up your local Docker daemon with AWS credentials, use our ironkube tool.

Set Up ironkube

https://ironnet.atlassian.net/wiki/spaces/CO/pages/843743236/Setup+ironkube+SDK#SetupironkubeSDK-Installation

Set Up ironkube ecr helper

https://ironnet.atlassian.net/wiki/spaces/CO/pages/843743236/Setup+ironkube+SDK#SetupironkubeSDK-ConfigureECRHelper

Basically, just run ironkube ecr once ironkube is installed.

Generating code from protobuf

Finally, you should be set up, and the schemas-proto Makefile should now magically work. Go back into the schemas-proto repo and generate the code manually:

make gen -j

or (for slower, serial builds but more readable logs)

make gen

If everyone is responsible and regenerates on every commit, git status should report no changes after make gen. To ensure this is true, keep reading.
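The "no changes via git status" check above can be sketched as a small CI-style guard. This is a sketch, assuming it runs at the repo root after make gen; check_generated_clean is a hypothetical helper name, not part of this repo:

```shell
#!/bin/sh
# Sketch: fail when generated bindings are stale, i.e. regeneration changed
# files that were not committed. Run at the repo root after `make gen -j`.
check_generated_clean() {
    if [ -n "$(git status --porcelain)" ]; then
        echo "generated code is out of date; run 'make gen' and commit the result" >&2
        return 1
    fi
}
# Usage in CI: make gen -j && check_generated_clean
```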

Adding a new protobuf version target

If you're looking into adding a new protobuf version target then you might be starting the process of upgrading gRPC, protobuf, etc. In that case you should also check out the Schemas-proto, gRPC and Protobuf Upgrade Guide in addition to the schemas-proto specific information here.

make gen generates multiple versions of the protobuf and gRPC definitions to accommodate migrating between versions of protobuf, gRPC, etc. that could involve breaking changes. To add a new version:

  1. Create the Dockerfile for the new build env. This should have all the major dependencies, such as the correct gRPC and protobuf versions, that the generation steps will need. Make sure that this file is named Dockerfile_pb_{PROTOBUF_VERSION}.build_env, e.g. Dockerfile_pb_3_6_1.build_env.
  2. Add new variables and targets to the Makefile for this new version. This will include adding new version-specific variables, typically just a PROTOBUF_VERSION, e.g. %-3.13.0: PROTOBUF_VERSION = 3.13.0, and version-specific targets to build-env, pull-env, push-env, and gen, e.g. build-env: build-env-version-3.6.1 build-env-version-3.13.0.
  3. Test out the new build image following the steps in Modifying our Dockerfile.build-env?
  4. Add a new Java build (see upgrade guide) and optionally pin Python dependencies (consumed by projects that pull in our Python proto) in /pb-version/<new-version>/python/requirements.txt.
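The Makefile wiring described in step 2 might look roughly like the following. This is a hedged sketch: the variable and target names follow the patterns quoted above, not the verified contents of the actual Makefile.

```makefile
# Version-specific variable for all targets matching %-3.13.0
%-3.13.0: PROTOBUF_VERSION = 3.13.0

# Each umbrella target fans out to one sub-target per supported version
build-env: build-env-version-3.6.1 build-env-version-3.13.0
pull-env:  pull-env-version-3.6.1  pull-env-version-3.13.0
push-env:  push-env-version-3.6.1  push-env-version-3.13.0
gen:       gen-version-3.6.1       gen-version-3.13.0
```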

What if I am maintaining schemas-proto when it's a submodule within a parent repository?

Managing submodules as part of a parent project is somewhat out of scope here, but if you're doing it, you'll want to auto-generate our code bindings whenever proto files are updated. The pre-commit hook can be set up for submodules as well, but the git hooks directory will live under the parent's .git metadata folder.

For example, the platforms repository has the schemas-proto repository as a submodule in a Go vendor folder.

$GOPATH/src/github.com/ironnetcybersecurity/platforms/go/vendor/github.com/ironnetcybersecurity/schemas-proto

If I want to test updates to the proto code in my platforms code, I can update the submodule code and run make gen to have new Go bindings generated and ready to consume in platforms. Once I'm happy with my changes, I can prepare those proto updates for a PR by committing them to a branch in the schemas-proto submodule per our usual workflow. Remember, a submodule is still just a git repository, and can have hooks, remotes, branches, etc. So for my Go code to be generated automatically upon commit, I could install the pre-commit hook with the following relative path:

cd $GOPATH/src/github.com/ironnetcybersecurity/platforms/go/vendor/github.com/ironnetcybersecurity/schemas-proto
# Note that this path will be different depending on the location of your submodule within your parent repository.
ln -s codegen-precommit.sh ../../../../../.git/modules/go/vendor/github.com/ironnetcybersecurity/schemas-proto/hooks/pre-commit
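Rather than counting ../ segments by hand, you can also ask git itself where the hooks directory lives: git rev-parse --git-path hooks resolves correctly inside submodules too (git 2.5+). A sketch, using a throwaway repo for demonstration:

```shell
# Sketch: locate a repo's (or submodule's) hooks directory with git itself.
# In a submodule checkout, this resolves into the parent's .git/modules/... tree.
cd "$(mktemp -d)" && git init -q .   # throwaway repo; in practice, cd into the submodule
hooks_dir="$(git rev-parse --git-path hooks)"
echo "$hooks_dir"                    # in a plain repo this prints .git/hooks
# Real usage: ln -s "$PWD/codegen-precommit.sh" "$hooks_dir/pre-commit"
```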

Notice that the symlink is placed deep inside the platforms repo's .git metadata folder, under the entry for the schemas-proto submodule. Now any commits made within the schemas-proto submodule that include proto changes will automatically run make gen and fold any code-binding updates into the commit (the hook runs even if the bindings happen to already be up to date).

Adding a new service?

If you're adding a new service, make sure it gets listed in the build.sh file under the SERVICES= definition.

This is what triggers our make gen to properly generate those files.

Modifying our Dockerfile.build-env?

If we modify the build-env Dockerfile, we need to test and push it, and also make sure our BUILD_ENV_HASH/VERSION files get updated and committed with the PR:

  1. Make Dockerfile_pb_{PROTOBUF_VERSION}.build-env changes.
  2. Generate new images with make build-env.
  3. Test the local image as needed w/ make gen-local. This will use the local images tagged with :build-pb-$(PROTOBUF_VERSION) that we built in the previous step.
  4. Push the image to our AWS ECR with make push-env. This step will update the BUILDENV_VERSION and BUILD_ENV_HASH_PB{PROTOBUF_VERSION} files. These files MUST be committed with your PR otherwise no one else will pull down the new Docker image in the future.
  5. Test the image as needed w/ make gen to confirm that the build images pulled from ECR work as expected.

gRPCurl and Protosets

We can use a tool called gRPCurl and protoset files to hit a gRPC service without the need to write a client. Think of gRPCurl as cURL for gRPC.

Installing gRPCurl

To install gRPCurl you must have Go installed. Then run:

go get -u github.com/fullstorydev/grpcurl/cmd/grpcurl

(On Go 1.17 and later, installing binaries with go get is deprecated; use go install github.com/fullstorydev/grpcurl/cmd/grpcurl@latest instead.)

What is a protoset file

A protoset file describes the RPC schema for a gRPC service. It is generated using the protoc command. A stand-alone protoset can be generated by including the --include_imports flag:

protoc --descriptor_set_out=<OUTPUT-PATH> --include_imports <PATH-TO-PROTO-FILE>

You can also generate protosets via the gen-protoset make target.

make gen-protoset

Using a Protoset File with gRPCurl

Now that we have generated our protoset files, gRPCurl can use them to display information about a service and its inputs, or to send a request to a service.

Getting Information About the Service

grpcurl can retrieve information about a service or message using list and describe.

grpcurl list

grpcurl list lists all of the services defined in the protoset.

grpcurl -protoset protoset/services/threat/service/threat.protoset list

Output

service.ThreatService

If you want to list all of the methods associated with a service you can provide the service after the list verb.

grpcurl -protoset protoset/services/threat/service/threat.protoset list service.ThreatService

Output

AddEventToAlert
BulkCreateAlertWorkflows
CreateAlert
CreateAlertNote
CreateAlertWorkflow
CreateEvent
CreateExpertSystemRule
DeactivateAllExpertSystemRules
DeleteAlert
DeleteAlertNotes
GetAlert
...

grpcurl describe

grpcurl describe is similar to list, but it gives a description of each symbol. It can also be used to get a description of a message. Like list, it will describe all of the services in the protoset file if nothing is provided after the describe verb.

grpcurl -protoset protoset/services/threat/service/threat.protoset describe

Output

service.ThreatService is a service:
{
  "name": "ThreatService",
  "method": [
    {
      "name": "GetAnalytics",
      "inputType": ".GetAnalyticsRequest",
      "outputType": ".GetAnalyticsResponse",
      "options": {

      }
    },
    {
      "name": "RegisterAnalytic",
      "inputType": ".RegisterAnalyticRequest",
      "outputType": ".RegisterAnalyticResponse",
      "options": {

      }
    },
    {
      "name": "UpdateAnalytic",
      "inputType": ".UpdateAnalyticRequest",
      "outputType": ".UpdateAnalyticResponse",
      "options": {

      }
    },
    {
      "name": "GetComprehensiveReportData",
      "inputType": ".GetComprehensiveReportDataRequest",
      "outputType": ".GetComprehensiveReportDataResponse",
      "options": {

      }
    },
    {
      "name": "ValidateEvent",
      "inputType": ".ValidateEventRequest",
      "outputType": ".ValidateEventResponse",
      "options": {

      }
    },

    ...

  ],
  "options": {

  }
}

You can get information about a specific RPC symbol by including it after the describe verb.

grpcurl -protoset protoset/services/threat/service/threat.protoset describe .ValidateEventRequest

Output

ValidateEventRequest is a message:
{
  "name": "ValidateEventRequest",
  "field": [
    {
      "name": "event",
      "number": 1,
      "label": "LABEL_OPTIONAL",
      "type": "TYPE_MESSAGE",
      "typeName": ".event.Event",
      "options": {

      },
      "jsonName": "event"
    }
  ],
  "options": {

  }
}

Send a Request with gRPCurl

You can send a request with gRPCurl by using a protoset file and passing in data. The data must be JSON-encoded. Note that there is some weirdness when dealing with bytes fields in gRPC: you will need to base64-encode your raw bytes before placing them in the JSON payload.

grpcurl \
    -protoset <PATH-TO-PROTOSET> \
    -rpc-header "<RPC-HEADERS>" \
    -d "<JSON-ENCODED-DATA>" \
    <SERVICE-ADDRESS> \
    <ENDPOINT-AS-DESCRIBED-IN-PROTOSET>
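The bytes handling mentioned above can be sketched in Python; the "data" field name and message shape here are hypothetical:

```python
# Sketch: protobuf `bytes` fields must be sent as base64 strings in grpcurl's
# JSON payload. The "data" field name here is hypothetical.
import base64
import json

raw = "hello".encode("utf-8")                              # the bytes you want to send
payload = {"data": base64.b64encode(raw).decode("ascii")}  # base64-encode for JSON
body = json.dumps(payload)
print(body)  # pass this string to grpcurl's -d flag: {"data": "aGVsbG8="}
```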

If you need help constructing the data for a request, you can use the -msg-template flag with the describe verb to output a JSON template at the bottom of the description.

grpcurl -protoset <PATH-TO-PROTOSET> -msg-template describe <RPC-SYMBOL>

Example Query

Let's use this information to try to hit the SiteConfigApiService gRPC service.

Logging in

Hitting the SiteConfigApiService requires a JWT, so we must first hit the login endpoint of the auth service to get one:

grpcurl \
    -protoset protoset/services/auth/service/auth.protoset \
    -d '{"user_name": "john@example.com", "password": "supersecret"}' \
    example.com:5678 \
    service.AuthService/Login

Output

{
  "token": "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1MzU3NjcwOTEsImlkIjoyNzQsInN0YWZmIjp0cnVlLCJ1c2VybmFtZSI6ImhlcnBhZGVycCJ9.Gvt6f8HLFbU9a2PyvZUD4BMtUCgXTcbPdBAhX5kIGCI"
}
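As an aside, the returned token is a standard JWT: its payload segment is base64url-encoded JSON, so you can inspect claims such as exp (expiry) before reusing a saved token. A minimal Python sketch, using the example token above:

```python
# Sketch: a JWT is three base64url segments (header.payload.signature); the
# payload is plain JSON. This decodes claims only; it does NOT verify the signature.
import base64
import json

token = "eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1MzU3NjcwOTEsImlkIjoyNzQsInN0YWZmIjp0cnVlLCJ1c2VybmFtZSI6ImhlcnBhZGVycCJ9.Gvt6f8HLFbU9a2PyvZUD4BMtUCgXTcbPdBAhX5kIGCI"

payload_b64 = token.split(".")[1]
payload_b64 += "=" * (-len(payload_b64) % 4)   # restore stripped base64 padding
claims = json.loads(base64.urlsafe_b64decode(payload_b64))
print(claims["exp"])  # expiry as a Unix timestamp
```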

Sending a Request to SiteConfigApiService

We are going to use the JWT from the previous step to authenticate to the service by passing it in an authorization header via the -rpc-header flag.

token="eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJleHAiOjE1MzU3NjcwOTEsImlkIjoyNzQsInN0YWZmIjp0cnVlLCJ1c2VybmFtZSI6ImhlcnBhZGVycCJ9.Gvt6f8HLFbU9a2PyvZUD4BMtUCgXTcbPdBAhX5kIGCI"

grpcurl \
    -protoset protoset/services/siteconfig/service/siteconfig.protoset \
    -rpc-header "authorization: Bearer $token" \
    -d '{ "cert_name": "vue" }' \
    site.example.com:8080 \
    service.SiteConfigApiService/GetCertificateCertInfo

Output

{
  "section": [
    {
      "sectionName": "Subject",
      "field": [
        {
          "fieldName": "countryName",
          "fieldValue": "US"
        },
        {
          "fieldName": "stateOrProvinceName",
          "fieldValue": "MD"
        },
        {
          "fieldName": "localityName",
          "fieldValue": "SomeTown"
        },
        {
          "fieldName": "organizationName",
          "fieldValue": "A company"
        },
        {
          "fieldName": "organizationalUnitName",
          "fieldValue": "Software"
        },
        {
          "fieldName": "commonName",
          "fieldValue": "localhost"
        }
      ]
    },

   ...

    {
      "sectionName": "CA Status",
      "field": [
        {
          "fieldName": "X509v3 CA",
          "fieldValue": "FALSE"
        },
        {
          "fieldName": "Self-Signed",
          "fieldValue": "TRUE"
        }
      ]
    }
  ]
}