Akto - API Security platform
Helm Deploy


Last updated 4 months ago


You can install Akto via Helm charts. Read the announcement blog to learn more.

Resources

Akto's Helm chart repo is on GitHub. You can also find Akto on Helm.sh.

Prerequisites

Please ensure you have the following -

  1. A Kubernetes cluster where you have deploy permissions

  2. helm command installed (see the official Helm docs for installation)
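A quick way to sanity-check the tooling before starting (a minimal sketch; it only reports whether each binary is on your PATH):

```shell
# Report whether each required CLI tool is installed.
check_tool() {
  command -v "$1" >/dev/null 2>&1 && echo "$1: found" || echo "$1: missing"
}

check_tool kubectl
check_tool helm
```

If either tool is reported missing, install it before proceeding with the steps below.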

Steps

Here are the steps to install Akto via Helm charts -

  1. Prepare Mongo Connection string - You can create a fresh Mongo instance, or reuse an existing one if you previously installed Akto in your cloud.

  2. Install Akto via Helm

  3. Verify Installation and harden security

Prepare Mongo Connection string

Akto Helm setup needs a Mongo connection string as input. It can come from any of the following -

  1. Your own Mongo - Ensure the machine where you set up Mongo is NOT exposed to the public internet; it shouldn't have a public IP. You can set up a Mongo cluster as follows. Create the following file: mongo-cluster-setup.yaml

---
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: mongo-storage
provisioner: efs.csi.aws.com
parameters:
  provisioningMode: efs-ap
  fileSystemId: fs-0a64ff88e3f61684d # mention your fs id
  directoryPerms: "700"
  gidRangeStart: "1000"
  gidRangeEnd: "2000"
  basePath: "/akto1"
  # optional: specify access point
  # accessPointId: <your-access-point-id>
reclaimPolicy: Retain
volumeBindingMode: Immediate
---
apiVersion: v1
kind: Service
metadata:
  name: mongo
spec:
  ports:
  - port: 27017
    targetPort: 27017
  clusterIP: None
  selector:
    app: mongo
---
apiVersion: apps/v1
kind: StatefulSet
metadata:
  name: mongo
spec:
  selector:
    matchLabels:
      app: mongo
  serviceName: "mongo"
  replicas: 3
  template:
    metadata:
      labels:
        app: mongo
    spec:
      containers:
        - name: mongo
          image: mongo:6.0.1
          args: ["--dbpath", "/data/db"]
          startupProbe:
            exec:
              command:
                - mongosh
                - --eval
                - "db.adminCommand('ping')"
            initialDelaySeconds: 1
            periodSeconds: 10
            timeoutSeconds: 5
            successThreshold: 1
            failureThreshold: 2
          livenessProbe:
            exec:
              command:
                - mongosh
                - --eval
                - "db.adminCommand('ping')"
            initialDelaySeconds: 1
            periodSeconds: 10
            timeoutSeconds: 5
            successThreshold: 1
            failureThreshold: 2
          readinessProbe:
            exec:
              command:
                - mongosh
                - --eval
                - "db.adminCommand('ping')"
            initialDelaySeconds: 1
            periodSeconds: 10
            timeoutSeconds: 5
            successThreshold: 1
            failureThreshold: 2
          command:
            - mongod
            - "--bind_ip_all"
            - "--replSet"
            - rs0
          volumeMounts:
            - name: mongo-volume
              mountPath: /data/db
  volumeClaimTemplates:
    - metadata:
        name: mongo-volume
      spec:
        accessModes: ["ReadWriteOnce"]
        storageClassName: mongo-storage # must match the StorageClass name defined above
        resources:
          requests:
            storage: 1Gi

Now execute the following command: kubectl apply -f mongo-cluster-setup.yaml -n {namespace}

Wait a couple of minutes until you see the 3 mongo pods (mongo-0, mongo-1 and mongo-2) in the Running state. Once the pods are running, execute the following commands to initialize the cluster:

kubectl exec -it mongo-0 -n {namespace} -- mongosh

# execute the next command from within mongo shell
rs.initiate({
    _id: "rs0",
    members: [
        {_id: 0, host:"mongo-0.mongo.default.svc.cluster.local:27017"},
        {_id: 1, host:"mongo-1.mongo.default.svc.cluster.local:27017"},
        {_id: 2, host:"mongo-2.mongo.default.svc.cluster.local:27017"}
    ]
})

The connection string would then be mongodb://mongo-0.mongo.default.svc.cluster.local:27017,mongo-1.mongo.default.svc.cluster.local:27017,mongo-2.mongo.default.svc.cluster.local:27017/admin (replace default with your namespace if you deployed Mongo in a different one)
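The connection string follows the standard Kubernetes headless-service DNS pattern (`<pod>.<service>.<namespace>.svc.cluster.local`). A small helper to compose it for your own namespace and database (a sketch; the pod and service names match the manifest above):

```shell
# Build the replica-set connection string for the 3-member "mongo" StatefulSet.
build_mongo_conn() {
  ns="$1"   # Kubernetes namespace the StatefulSet runs in
  db="$2"   # database name to append to the URI
  hosts=""
  for i in 0 1 2; do
    hosts="${hosts}${hosts:+,}mongo-${i}.mongo.${ns}.svc.cluster.local:27017"
  done
  echo "mongodb://${hosts}/${db}"
}

build_mongo_conn default admin
```

Pass the result to the helm install command in the Install Akto via Helm section.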

  2. Mongo Atlas - You can use a Mongo Atlas connection as well:

    1. Go to Database Deployments page for your project

    2. Click on Connect button

    3. Choose Connect your application option

    4. Copy the connection string. It should look like mongodb://....

  3. AWS DocumentDB - If you are on AWS, you can use AWS DocumentDB too. You can find the connection string on the cluster page itself.

  4. Existing Akto setup - If you have previously installed Akto via the CloudFormation template and you want to move to Helm, please execute the following steps. This guide applies only if you are NOT using AWS Traffic Mirroring. If you are using AWS Traffic Mirroring, please contact us at support@akto.io.

    1. Go to AWS > EC2 > Auto Scaling Groups and search for Akto.

    2. Edit all autoscaling groups and set min/max/desired to 0.

    3. This shuts down all existing Akto infra and just leaves Akto-Mongo running.

    4. [Optional - only if you want to delete the CloudFormation stacks once migration completes] Clone the Akto Mongo instance: you can create an AMI and launch a new instance from it. Alternatively, you can also -

      • Go to AWS > EC2 > Instances and search for "Akto Mongo instance". Launch a new instance using this template.

      • SSH into the new Mongo instance, run sudo su -, and then docker stop mongo.

      • Run rm -rf /akto/infra/data/ on the new Mongo instance.

      • Copy /akto/infra/data/ from the old Mongo instance to the same location (/akto/infra/data/) on the new instance using SCP.

      • Run docker start mongo.

    5. If you have installed Akto's K8s agent in your K8s cluster in the previous CloudFormation setup, please run kubectl delete -f akto-daemonset-config.yml to halt the traffic processing too.

    6. Once you set up Akto via the Helm chart, try logging in with your previous credentials and check the data. All your data should be retained.

    7. Change the AKTO_NLB to the output of kubectl get services/flash-akto-runtime -n staging -o jsonpath="{.spec.clusterIP}"

    8. Run kubectl apply -f akto-daemonset-config.yml

    9. Confirm Akto dashboard has started receiving new data.

    10. Please do NOT delete the AWS CloudFormation stacks - doing so will delete the Mongo instance too and you'll lose the data. If you want to delete the CloudFormation stacks, first set up a duplicate Mongo instance as described in step (4), and use the private IP of this new instance for step (6).

  5. Mongo cluster setup via cfn template - You can set up a Mongo cluster using Akto's CloudFormation template. The template requires 2 inputs:

    1. PrivateSubnetId: Select the private subnet in which you want the cluster to be created. Make sure this subnet has a route to a NAT Gateway.

    2. KeyPair: This key pair will be used to SSH into the instances.

    The default instance type in the template is m6a.large; you can change it in the template as per your requirement. We recommend not using t3/t4 instance types for running a cluster. Once this template executes successfully, you will see 3 EC2 instances created, and you can access the connection URL from the output section once the cfn execution completes. Note: Please ensure your K8s cluster has connectivity to Mongo.
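One way to check the connectivity requirement is to run a throwaway pod inside the cluster and probe the Mongo port (a sketch; `<MONGO_HOST>` is a placeholder for your Mongo host or private IP, and `dev` stands in for your namespace):

```shell
# Probe TCP reachability of Mongo (port 27017) from inside the cluster.
# The pod is deleted automatically after the command exits (--rm).
kubectl run mongo-conn-test --rm -it --restart=Never --image=busybox:1.36 -n dev \
  -- sh -c 'nc -zv -w 5 <MONGO_HOST> 27017'
```

If the probe fails, check your security groups, network policies, and subnet routing before installing the chart.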

Install Akto via Helm

  1. Add the Akto repo: helm repo add akto https://akto-api-security.github.io/helm-charts

  2. Install Akto via Helm: helm install akto akto/akto -n dev --set mongo.aktoMongoConn="<AKTO_CONNECTION_STRING>"

  3. Run kubectl get pods -n dev (or your chosen namespace) and verify you can see 4 pods
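Later chart upgrades follow the standard Helm flow (a sketch; the release name and namespace match the install command above, and --reuse-values keeps the Mongo connection string you set at install time):

```shell
# Pull the latest chart index, then upgrade the existing release in place.
helm repo update
helm upgrade akto akto/akto -n dev --reuse-values
```

Run kubectl get pods -n dev afterwards to confirm the pods roll over cleanly.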

Verify Installation and harden security

  1. Run the following to get the Akto dashboard URL: kubectl get services/akto-dashboard -n dev | awk -F " " '{print $4;}'

  2. Open the Akto dashboard on port 8080, e.g. http://a54b36c1f4asdaasdfbd06a259de2-acf687643f6fe4eb.elb.ap-south-1.amazonaws.com:8080/
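If your cluster doesn't provision an external load balancer, you can reach the dashboard from your workstation via a port-forward instead (a sketch, assuming the akto-dashboard service name and the dev namespace from the install step):

```shell
# Forward local port 8080 to the dashboard service, then browse http://localhost:8080
kubectl port-forward service/akto-dashboard 8080:8080 -n dev
```

The forward runs until you interrupt it with Ctrl+C.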

If Akto Cluster is Deployed in a Separate Kubernetes Cluster

If you encounter the error Can't connect to Kafka in your daemonset and you have exposed the Akto runtime service via a route that doesn't resemble *.svc.cluster.local, you'll need to update the KAFKA_ADVERTISED_LISTENERS environment variable in the akto-runtime deployment. Follow these steps:

  1. Change the KAFKA_ADVERTISED_LISTENERS environment variable to match your route using the following command: kubectl set env deployment/{deployment-name} KAFKA_ADVERTISED_LISTENERS="LISTENER_DOCKER_EXTERNAL_LOCALHOST://localhost:29092, LISTENER_DOCKER_EXTERNAL_DIFFHOST://{Service_Endpoint}:9092" -n {namespace}

  2. Verify the change with this command: kubectl get deployment {deployment-name} -o jsonpath="{.spec.template.spec.containers[?(@.name=='kafka1')].env[?(@.name=='KAFKA_ADVERTISED_LISTENERS')].value}" -n {namespace}

Replace {deployment-name}, {Service_Endpoint}, and {namespace} with your actual deployment name, service DNS, and namespace respectively.
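The listener value has a strict shape, so a tiny helper makes the substitution explicit (a sketch; the listener names mirror the command above, and the endpoint argument is whatever DNS name your route exposes):

```shell
# Compose the KAFKA_ADVERTISED_LISTENERS value for a given external endpoint.
build_listeners() {
  endpoint="$1"
  echo "LISTENER_DOCKER_EXTERNAL_LOCALHOST://localhost:29092, LISTENER_DOCKER_EXTERNAL_DIFFHOST://${endpoint}:9092"
}

build_listeners my-akto-runtime.example.internal
```

Pass the resulting string to kubectl set env as shown in step 1 above.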

Use the private IP of this Mongo instance while installing the Helm chart (refer to the Install Akto via Helm section).

Mongo on K8s with Persistent Volume - You can set up Mongo on the K8s cluster itself with a Persistent Volume; a sample template is provided above. Use the IP of this service as the Mongo private IP in the Install Akto via Helm section. If you are migrating from a previous Akto installation, you must bootstrap the persistent volume with the original Mongo instance's data before you start the Mongo service.

Mongo cluster setup via cfn template - Use the provided CloudFormation template (see the inputs described above).

For good security, you should enable HTTPS by adding a certificate and put the dashboard behind a VPN. If you are on AWS, follow the corresponding guide.
