Connect Akto with NGINX


Last updated 2 months ago


If your API calls are being routed through NGINX, you can use Akto's NGINX module to send traffic to the Akto dashboard. The guide below walks you through this in two steps:

Step 1: Configure Akto Traffic Processor

Set up and configure the Akto Traffic Processor (mini-runtime) first. Note the address of the mini-runtime service; it is used as AKTO_NLB_IP in the configurations below.

Step 2: Add NGINX module

This method is recommended when you have end-to-end TLS and TLS/SSL termination happens at NGINX.

The Akto nginx module uses the dynamic-module functionality supported by nginx. This requires nginx to be built from source; the exact steps can vary slightly depending on the Linux flavour, but the core process remains the same.

Note: For AKTO_NLB_IP in the configurations below, use the address of the mini-runtime service deployed in Step 1.
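As a quick sketch of that substitution, the hypothetical script below fills the placeholder into a copy of the Kafka directives before they are pasted into nginx.conf; 10.0.1.25 is an illustrative address, not a real Akto endpoint:

```shell
# Hypothetical example: fill the <AKTO_NLB_IP> placeholder into the Kafka
# directives used later in this guide. Replace 10.0.1.25 with the real
# mini-runtime service address from Step 1.
AKTO_NLB_IP="10.0.1.25"
snippet=$(mktemp)
cat > "$snippet" <<'EOF'
kafka_log_kafka_brokers <AKTO_NLB_IP>:9092;
kafka_log_kafka_buffer_max_messages 100000;
EOF
sed "s/<AKTO_NLB_IP>/$AKTO_NLB_IP/" "$snippet"
rm -f "$snippet"
```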

Ubuntu / Debian based
  1. Record all API calls using nginx-module-njs. (njs is a standard NGINX module, built and shipped with every release of NGINX.) You can install it by running apt install nginx-module-njs

  2. The request/response data is sent to Akto using nginx-kafka-log-module (to get it, run git clone https://github.com/kaltura/nginx-kafka-log-module.git and build it as an nginx dynamic module; the Amazon Linux sections below show a full source build). In your NGINX conf file (/etc/nginx/nginx.conf), add the following:

load_module /usr/lib/nginx/modules/ngx_http_js_module.so;
load_module /usr/lib/nginx/modules/ngx_http_kafka_log_module.so;

  3. Download the Akto njs script and save it as /etc/nginx/njs/api_log.js:

mkdir /etc/nginx/njs
curl -o /etc/nginx/njs/api_log.js https://raw.githubusercontent.com/akto-api-security/nginx-middleware/master/api_log.js

  4. Add the following lines in the http section of /etc/nginx/nginx.conf:

subrequest_output_buffer_size 8k;
js_path "/etc/nginx/njs/";
js_var $responseBo "{}";
js_import main2 from api_log.js;
kafka_log_kafka_brokers <AKTO_NLB_IP>:9092;
kafka_log_kafka_buffer_max_messages 100000;

  5. In /etc/nginx/conf.d/default.conf, add these two lines in the server > location section:

server {
    location / {
        .....
        js_body_filter main2.to_lower_case buffer_type=buffer;
        kafka_log kafka:akto.api.logs $responseBo;
    }
}

  6. Restart NGINX with nginx -s reload. This will start sending all request-response logs to Akto.
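Putting the Ubuntu/Debian pieces together, a minimal end-to-end nginx.conf sketch looks like the following; <AKTO_NLB_IP>, the listen port, and the proxy_pass upstream are placeholders/assumptions, not part of the Akto setup:

```nginx
# Minimal consolidated sketch (Ubuntu/Debian module paths).
load_module /usr/lib/nginx/modules/ngx_http_js_module.so;
load_module /usr/lib/nginx/modules/ngx_http_kafka_log_module.so;

events {}

http {
    subrequest_output_buffer_size 8k;
    js_path "/etc/nginx/njs/";
    js_var $responseBo "{}";
    js_import main2 from api_log.js;
    kafka_log_kafka_brokers <AKTO_NLB_IP>:9092;
    kafka_log_kafka_buffer_max_messages 100000;

    server {
        listen 80;
        location / {
            js_body_filter main2.to_lower_case buffer_type=buffer;
            kafka_log kafka:akto.api.logs $responseBo;
            proxy_pass http://127.0.0.1:3000;   # your actual application
        }
    }
}
```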

Amazon Linux 2
  1. sudo su -

  2. To set up the yum repository for nginx on Amazon Linux 2, create a file named /etc/yum.repos.d/nginx.repo with the following content. This is needed to install nginx (if not already present) and nginx-module-njs.

    [nginx-stable]
    name=nginx stable repo
    baseurl=http://nginx.org/packages/amzn2/$releasever/$basearch/
    gpgcheck=1
    enabled=1
    gpgkey=https://nginx.org/keys/nginx_signing.key
    module_hotfixes=true
    priority=9
    
    [nginx-mainline]
    name=nginx mainline repo
    baseurl=http://nginx.org/packages/mainline/amzn2/$releasever/$basearch/
    gpgcheck=1
    enabled=0
    gpgkey=https://nginx.org/keys/nginx_signing.key
    module_hotfixes=true
    priority=9
  3. If nginx is not present, install it using yum install nginx; otherwise skip this step. Then install nginx-module-njs using yum install nginx-module-njs.

  4. Check your nginx version using nginx -v and download/extract the matching source using the following commands.

wget http://nginx.org/download/nginx-{version}.tar.gz
tar -zxvf nginx-{version}.tar.gz

e.g.

wget http://nginx.org/download/nginx-1.26.0.tar.gz
tar -zxvf nginx-1.26.0.tar.gz
# Enable EPEL repository if not already enabled
amazon-linux-extras install epel -y
# Install librdkafka and its development package
yum install librdkafka librdkafka-devel -y
yum install pcre pcre-devel -y
yum groupinstall "Development Tools" -y
# Clone nginx-kafka-log-module next to the nginx source directory
git clone https://github.com/kaltura/nginx-kafka-log-module.git
# Go to the nginx source directory downloaded in step 4
cd nginx-1.26.0/
./configure --with-compat --add-dynamic-module=../nginx-kafka-log-module --with-cc-opt="-I/usr/include" --with-ld-opt="-L/usr/lib"
make modules
cp objs/ngx_http_kafka_log_module.so /etc/nginx/modules/
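One common pitfall here: the module must be built from the same nginx version as the installed binary, or nginx will refuse to load it at startup with a binary-compatibility error. The sketch below derives the matching tarball URL from the `nginx -v` output; the sample string stands in for the real command output, which is an assumption for illustration:

```shell
# nginx -v prints its version to stderr, e.g. "nginx version: nginx/1.26.0".
# The sample string below stands in for: ver_output=$(nginx -v 2>&1)
ver_output="nginx version: nginx/1.26.0"
version=${ver_output##*/}   # strip everything up to the last '/'
echo "http://nginx.org/download/nginx-${version}.tar.gz"
```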
  5. Add the Akto njs code to the nginx njs directory using the following commands.

mkdir /etc/nginx/njs
curl -o /etc/nginx/njs/api_log.js https://raw.githubusercontent.com/akto-api-security/nginx-middleware/master/api_log.js
  6. In your nginx configuration file (/etc/nginx/nginx.conf), add the following lines at the top:

load_module /etc/nginx/modules/ngx_http_js_module.so;
load_module /etc/nginx/modules/ngx_http_kafka_log_module.so;
  7. Also add the following in the http section of /etc/nginx/nginx.conf. Replace <AKTO_NLB_IP> with the value you obtained while setting up the traffic processor.

subrequest_output_buffer_size 8k;
js_path "/etc/nginx/njs/";
js_var $responseBo "{}";
js_import main2 from api_log.js;
kafka_log_kafka_brokers "<AKTO_NLB_IP>:9092";
kafka_log_kafka_buffer_max_messages 100000;
  8. Add the following to your server's .conf file (you can find its path in the include section of /etc/nginx/nginx.conf). Make sure the traffic here is being proxied to your actual application.

location / {
    js_body_filter main2.to_lower_case buffer_type=buffer;
    kafka_log kafka:akto.api.logs $responseBo;
    ......
}
  9. Reload nginx with nginx -s reload if it is already running; otherwise start it with systemctl start nginx.

Amazon Linux 2023
  1. sudo su -

  2. To set up the yum repository for nginx on Amazon Linux 2023, create a file named /etc/yum.repos.d/nginx.repo with the following content. This is needed to install nginx (if not already present) and nginx-module-njs.

    [nginx-stable]
    name=nginx stable repo
    baseurl=http://nginx.org/packages/amzn/2023/$basearch/
    gpgcheck=1
    enabled=1
    gpgkey=https://nginx.org/keys/nginx_signing.key
    module_hotfixes=true
    priority=9
    
    [nginx-mainline]
    name=nginx mainline repo
    baseurl=http://nginx.org/packages/mainline/amzn/2023/$basearch/
    gpgcheck=1
    enabled=0
    gpgkey=https://nginx.org/keys/nginx_signing.key
    module_hotfixes=true
    priority=9
  3. If nginx is not present, install it using yum install nginx -y; otherwise skip this step. Then install nginx-module-njs using yum install nginx-module-njs.

  4. Check your nginx version using nginx -v and download/extract the matching source using the following commands.

wget http://nginx.org/download/nginx-{version}.tar.gz
tar -zxvf nginx-{version}.tar.gz

e.g.

wget http://nginx.org/download/nginx-1.26.0.tar.gz
tar -zxvf nginx-1.26.0.tar.gz
  5. i. To set up the yum repository for Confluent on Amazon Linux 2023, create a file named /etc/yum.repos.d/confluent.repo with the following content.

    [Confluent-Clients]
    name=Confluent Clients repository
    baseurl=https://packages.confluent.io/clients/rpm/centos/9/$basearch
    gpgcheck=1
    gpgkey=https://packages.confluent.io/clients/rpm/archive.key
    enabled=1

    ii. Run the following commands:

    yum install librdkafka1 librdkafka-devel -y
    yum install pcre pcre-devel -y
    yum groupinstall "Development Tools" -y
    # Clone nginx-kafka-log-module next to the nginx source directory
    git clone https://github.com/kaltura/nginx-kafka-log-module.git
    # Go to the nginx source directory downloaded in step 4
    cd nginx-1.26.0/
    ./configure --with-compat --add-dynamic-module=../nginx-kafka-log-module --with-cc-opt="-I/usr/include" --with-ld-opt="-L/usr/lib"
    make modules
    cp objs/ngx_http_kafka_log_module.so /etc/nginx/modules/
  6. Add the Akto njs code to the nginx njs directory using the following commands.

mkdir /etc/nginx/njs
curl -o /etc/nginx/njs/api_log.js https://raw.githubusercontent.com/akto-api-security/nginx-middleware/master/api_log.js
  7. In your nginx configuration file (/etc/nginx/nginx.conf), add the following lines at the top:

load_module /etc/nginx/modules/ngx_http_js_module.so;
load_module /etc/nginx/modules/ngx_http_kafka_log_module.so;
  8. Also add the following in the http section of /etc/nginx/nginx.conf. Replace <AKTO_NLB_IP> with the value you obtained while setting up the traffic processor.

subrequest_output_buffer_size 8k;
js_path "/etc/nginx/njs/";
js_var $responseBo "{}";
js_import main2 from api_log.js;
kafka_log_kafka_brokers "<AKTO_NLB_IP>:9092";
kafka_log_kafka_buffer_max_messages 100000;
  9. Add the following to your server's .conf file (you can find its path in the include section of /etc/nginx/nginx.conf). Make sure the traffic here is being proxied to your actual application.

location / {
    js_body_filter main2.to_lower_case buffer_type=buffer;
    kafka_log kafka:akto.api.logs $responseBo;
    ......
}
  10. Reload nginx with nginx -s reload if it is already running; otherwise start it with systemctl start nginx.
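Before reloading, it can help to confirm that both .so files referenced by the load_module lines are actually in place; a missing module makes nginx fail immediately at startup. The helper below is a hypothetical convenience script, not part of Akto or nginx:

```shell
# Hypothetical helper: report whether the two dynamic modules exist in the
# given module directory (/etc/nginx/modules on these distributions).
check_modules() {
  dir=$1
  for mod in ngx_http_js_module.so ngx_http_kafka_log_module.so; do
    if [ -f "$dir/$mod" ]; then
      echo "$mod: ok"
    else
      echo "$mod: missing"
    fi
  done
}

check_modules /etc/nginx/modules
```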

Note: We have benchmarked an nginx server with and without the Akto nginx traffic module. The results are as follows:

metrics           | vanilla nginx | nginx with akto module
avg. cpu usage    | up to 36%     | up to 38%
avg. memory usage | 0.5%          | 0.5%

The server used is an AWS EC2 instance (t3a.small: 2 vCPU + 2 GB RAM), with around 1600-1800 requests per second fired at the server continuously for over a minute (~110k requests per minute). Here nginx is configured as a reverse proxy to a Node.js backend server.
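For context, the quoted per-minute figure follows directly from the request rate; a quick arithmetic check:

```shell
# Sanity check: 1600-1800 requests/second sustained for 60 seconds
# gives roughly the ~110k requests/minute quoted above.
low=$((1600 * 60))
high=$((1800 * 60))
echo "${low}-${high} requests per minute"
```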
