# Connect Akto with Apigee

Apigee is Google Cloud's full-lifecycle API management platform that helps enterprises design, secure, and scale APIs. Integrating Apigee with Akto enables automatic discovery and security testing of all APIs managed through your Apigee gateway, providing comprehensive visibility and continuous security assessment of your API infrastructure.

<figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-62347e3a9b7afb20e555451a61a7a179d3ff5eb2%2Fimage.png?alt=media" alt=""><figcaption></figcaption></figure>

***

## Step 1: Deploy the Akto Data-Ingestion Service

Before setting up the Apigee connector, deploy the Akto Data-Ingestion Service by following these steps:

### 1.1 Download the Required Files

SSH into the instance where you want to deploy the data-ingestion service and run these commands:

```bash
wget https://raw.githubusercontent.com/akto-api-security/infra/refs/heads/feature/quick-setup/docker-compose-data-ingestion-runtime.yml
wget https://raw.githubusercontent.com/akto-api-security/infra/refs/heads/feature/quick-setup/data-ingestion-docker.env
wget https://raw.githubusercontent.com/akto-api-security/infra/refs/heads/feature/quick-setup/docker-mini-runtime.env
wget https://raw.githubusercontent.com/akto-api-security/infra/refs/heads/feature/quick-setup/watchtower.env
```
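
After the downloads complete, the working directory should contain the compose file and the three `.env` files; a quick listing confirms everything is in place:

```bash
ls -1 docker-compose-data-ingestion-runtime.yml data-ingestion-docker.env docker-mini-runtime.env watchtower.env
```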

### 1.2 Retrieve the `DATABASE_ABSTRACTOR_SERVICE_TOKEN`

* Log in to the [Akto Dashboard](https://app.akto.io/).
* Navigate to the **Quick Start** tab in the left panel.

  <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-d152571ef3e9cab51c734c3ec917d0c81c2ea0f1%2FQuick-Start.png?alt=media" alt=""><figcaption></figcaption></figure>
* Select **Hybrid SaaS Connector** and copy the token from the **Runtime Service Command** section.

  <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-f8b4740b01beeb8983ce102f2ed925cb94f0d165%2FHybridSaaSConnector.png?alt=media" alt=""><figcaption></figcaption></figure>

### 1.3 Update the `docker-mini-runtime.env` File

* Open the `docker-mini-runtime.env` file and replace the placeholder value `token` with the `DATABASE_ABSTRACTOR_SERVICE_TOKEN` you copied in Step 1.2.

```plaintext
DATABASE_ABSTRACTOR_SERVICE_TOKEN=token
```
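
If you prefer to make the edit from the shell (on a GNU/Linux host), a one-liner along these lines also works; `<your-token>` is a placeholder for the value copied in Step 1.2:

```bash
sed -i 's/^DATABASE_ABSTRACTOR_SERVICE_TOKEN=.*/DATABASE_ABSTRACTOR_SERVICE_TOKEN=<your-token>/' docker-mini-runtime.env
```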

### 1.4 Deploy the Data-Ingestion Service

Run the following command to start the data-ingestion service:

```bash
docker-compose -f docker-compose-data-ingestion-runtime.yml up -d
```
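
To confirm the service started cleanly, you can check the container status and tail the logs using the same compose file:

```bash
# List containers started from this compose file
docker-compose -f docker-compose-data-ingestion-runtime.yml ps

# Follow the logs to spot startup errors
docker-compose -f docker-compose-data-ingestion-runtime.yml logs -f
```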

### 1.5 Note the IP Address of the Data-Ingestion Service

Ensure the instance is accessible from the network where your Apigee API proxy is configured. Note the instance's IP address, as it will be required by the Apigee connector to send traffic data.
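
Optionally, you can verify from a machine on the same network as your Apigee runtime that the instance is reachable on TCP port 5140, the syslog port used by the MessageLogging policy in Step 2. This assumes `nc` (netcat) is installed; the IP below is a placeholder:

```bash
nc -vz <DATA_INGESTION_SERVICE_IP> 5140
```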

***

## Step 2: Configure Apigee to Use the Akto Data-Ingestion Service

Choose one of the following options to complete Step 2:

* **Option A:** Manual setup from the GCP Apigee UI.
* **Option B:** Automated setup using Terraform scripts from Akto's infra repository.

Both options configure Akto ingestion in Apigee. Option B is recommended for repeatable, CI/CD-friendly deployments.

### 2.1 Create or Choose an Apigee Environment

To configure the Akto connector, you need an **Intermediate** or **Comprehensive** environment in Apigee, as the JavaScript policy is not supported in the **Base** environment.

#### Steps to Create an Environment:

1. Log in to the [Apigee Management Console](https://console.cloud.google.com/apigee/overview).
2. Navigate to **Management → Environments** from the left-side navigation bar.

   <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-945768deb250b2e018a93a43ed50517c875a2435%2Fcreate-env_apigee.png?alt=media" alt=""><figcaption></figcaption></figure>
3. Click **+ Create Environment**.

   <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-7f417055b82506f825b255e4a10a7678f923f259%2Fcreate_env_instance_apigee.png?alt=media" alt=""><figcaption></figcaption></figure>
4. Provide the required details:
   * **Name**: Specify a name for your environment.
   * **Environment Type**: Choose **Intermediate** or **Comprehensive**.
5. Click **Create** to finalize your environment setup.

   <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-8f2bd277b9964e89fcf09b0c34caf9601bb00802%2Fapigee_env_details.png?alt=media" alt=""><figcaption></figcaption></figure>

If you already have an **Intermediate** or **Comprehensive** environment, you can skip this step and proceed to the next section.

### 2.2 Option A: Manual Setup from GCP UI (Shared Flow + Flow Hook)

This is the manual environment-wide setup.

1. In Apigee, go to **Proxy development → Shared Flows** and click **+ Create**.

   <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-06f502ad198ccf2341ca2b5090a25deb959308b4%2Fapigee_shared_flow.png?alt=media" alt=""><figcaption></figcaption></figure>
2. Create a shared flow (for example: `akto-traffic-collector`).
3. Open the shared flow, go to **Develop → default**, and add two steps in this order: first `AktoJavascript`, then `ML-SendAktoTcpSyslog` (you will create these policies in the next steps).
4. In the same shared flow, click **Policies +** and add a **JavaScript** policy named `AktoJavascript`.

   <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-29a7afa2a177cafb0f8893d72d90fb4dbc4ae8af%2Fapigee_shared_flow_javascript.png?alt=media" alt=""><figcaption></figcaption></figure>
5. Open the JavaScript policy XML (under the **Policies** section) and set it to:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<Javascript continueOnError="true" enabled="true" timeLimit="1000" name="AktoJavascript">
  <DisplayName>AktoJavascript</DisplayName>
  <Properties/>
  <ResourceURL>jsc://AktoJavascript.js</ResourceURL>
</Javascript>
```

6. Create a JavaScript resource file named `AktoJavascript.js` and paste the script shown after step 10.
7. Click **Policies +** again and add a **MessageLogging** policy named `ML-SendAktoTcpSyslog`. Set the policy XML to:

```xml
<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<MessageLogging name="ML-SendAktoTcpSyslog" continueOnError="true" enabled="true">
  <DisplayName>ML-SendAktoTcpSyslog</DisplayName>
  <Syslog>
    <Message>{akto.log.payload}</Message>
    <Host>YOUR_DATA_INGESTION_SERVICE_IP</Host>
    <Port>5140</Port>
    <Protocol>TCP</Protocol>
    <FormatMessage>false</FormatMessage>
  </Syslog>
</MessageLogging>
```

8. Save and deploy the shared flow to your target environment.

   <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-ba60fb28e559d000be263784c8cf5e196943929d%2Fapigee_deploy_sharedflow.png?alt=media" alt=""><figcaption></figcaption></figure>
9. Go to **Management → Environments → your\_environment → Flow Hooks**.
10. Attach the shared flow to a hook point (recommended: `PostProxyFlowHook`).

    <figure><img src="https://2916937215-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FRc4KTKGprZI2sPWKoaLe%2Fuploads%2Fgit-blob-7007236b09e67dfce945ded8e9a67a81a777c26d%2Fapigee_attach_sharedflow.png?alt=media" alt=""><figcaption></figcaption></figure>

```javascript
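// Build the payload that the ML-SendAktoTcpSyslog MessageLogging policy sends
// to the Akto data-ingestion service: read request/response flow variables,
// flatten the header name lists into objects, and store the JSON string in
// the akto.log.payload flow variable.
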
var requestPath = context.getVariable("request.uri");
var queryString = context.getVariable("request.querystring");
var requestHeaders = context.getVariable("request.headers.names");
var requestPayload = context.getVariable("request.content");
var clientIp = context.getVariable("request.header.x-forwarded-for");
var method = context.getVariable("request.verb");

var responseHeaders = context.getVariable("response.headers.names");
var responsePayload = context.getVariable("response.content");
var statusCode = context.getVariable("response.status.code");
var statusText = context.getVariable("response.reason.phrase") || "OK";

var rawTime = context.getVariable("system.timestamp");
var epochTime = Math.floor(rawTime / 1000);

var requestHeadersRes = {};
requestHeaders = (requestHeaders + '').slice(1, -1).split(', ');
requestHeaders.forEach(function(x) {
  requestHeadersRes[x] = context.getVariable("request.header." + x);
});

var responseHeadersRes = {};
responseHeaders = (responseHeaders + '').slice(1, -1).split(', ');
responseHeaders.forEach(function(x) {
  responseHeadersRes[x] = context.getVariable("response.header." + x);
});

var payload = {
    batchData: [{
        path: requestPath + (queryString ? "?" + queryString : ""),
        requestHeaders: JSON.stringify(requestHeadersRes),
        responseHeaders: JSON.stringify(responseHeadersRes),
        method: method,
        requestPayload: requestPayload || "",
        responsePayload: responsePayload || "",
        ip: clientIp || "0.0.0.0",
        time: "" + epochTime,
        statusCode: "" + statusCode,
        type: "HTTP/1.1",
        status: statusText,
        akto_account_id: "1000000",
        akto_vxlan_id: "0",
        is_pending: "false",
        source: "MIRRORING"
    }]
};

context.setVariable("akto.log.payload", JSON.stringify(payload));
```

Important policy behavior:

* Both `AktoJavascript` and `ML-SendAktoTcpSyslog` must have `continueOnError="true"`.
* Both policies must be added as Steps in the shared flow **default** section in order: JS first, then MessageLogging.
* Replace `YOUR_DATA_INGESTION_SERVICE_IP` in the MessageLogging policy with the IP noted in Step 1.5.

### 2.3 Option B: Terraform Automation

Use Terraform from:

* **Repository:** <https://github.com/akto-api-security/infra>
* **Branch:** `feature/quick-setup`
* **Folder:** `apigee-connect-terraform`

1. Clone and switch to the required branch:

```bash
git clone https://github.com/akto-api-security/infra.git
cd infra
git checkout feature/quick-setup
cd apigee-connect-terraform
```

2. Provide the required values in a `terraform.tfvars` file inside `apigee-connect-terraform`.

If the repository contains `terraform.tfvars.example`, copy it first:

```bash
cp terraform.tfvars.example terraform.tfvars
```

Otherwise create `terraform.tfvars` manually with:

```hcl
gcp_project_id             = "your-gcp-project-id"
apigee_environment         = "your-apigee-environment-name"
data_ingestion_service_url = "your-data-ingestion-service-ip:5140"
```
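
Depending on how your shell authenticates to Google Cloud, the Terraform Google provider may also need application-default credentials before you run the commands in the next step; one common way to set these up is:

```bash
gcloud auth application-default login
```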

3. Run Terraform:

```bash
terraform init
terraform apply -var-file="terraform.tfvars"
```

This automation creates and deploys the Apigee shared flow and attaches it to the selected environment flow hook.

### 2.4 Test the Integration

* Send test API traffic through any proxy deployed in the environment where the shared flow is attached (see the example request below).
* Verify in the Akto dashboard that traffic is being ingested.
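
As a minimal check, you can send a request to any API proxy deployed in that environment; the hostname and base path below are placeholders for your own proxy:

```bash
curl -i "https://<your-apigee-hostname>/<your-proxy-basepath>"
```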

***

### Get Support for your Akto setup

There are multiple ways to request support from Akto. We are available 24x7 on the following channels:

1. In-app Intercom support. Message us with your query on Intercom in the Akto dashboard and someone will reply.
2. Join our [Discord channel](https://www.akto.io/community) for community support.
3. Contact `help@akto.io` for email support.
4. Contact us [here](https://www.akto.io/contact-us).

