Configure TLS on kafka

Kafka, which is deployed as part of the hybrid runtime setup, can be configured to use TLS for all producers. The steps below create a certificate authority and broker certificates, store them in the cluster, enable TLS on the kafka broker, and point the traffic connectors at the CA certificate.

Steps:

  1. Create an openssl-san.cnf file with the content below. This file configures the SAN (Subject Alternative Name) for the certificates created in the next step.

[ req ]
distinguished_name = req_distinguished_name
req_extensions = v3_req
prompt = no

[ req_distinguished_name ]
CN = kafka-broker

[ v3_req ]
basicConstraints = CA:FALSE
keyUsage = digitalSignature, keyEncipherment
extendedKeyUsage = serverAuth
subjectAltName = @alt_names

[ alt_names ]
DNS.1 = akto-mini-runtime-mini-runtime.default.svc.cluster.local
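
The DNS.1 value must match the hostname of the kafka broker service created by the hybrid runtime helm release; the default above corresponds to a service named akto-mini-runtime-mini-runtime in the default namespace (see note 2 below if your deployment differs). As a quick sketch, assuming you have kubectl access to the cluster and the default release naming, you can confirm the service name like this:

# Look up the hybrid runtime kafka service; DNS.1 should be
# <service-name>.<namespace>.svc.cluster.local
kubectl get svc --all-namespaces | grep mini-runtime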
  2. Create the certificate stores and the certificate authority. The script below creates ca-cert.pem, server.keystore.jks and server.truststore.jks.

#!/bin/bash

# Create the CA key
openssl genrsa -out ca-key.pem 4096

# Create the CA cert
openssl req -x509 -new -key ca-key.pem -out ca-cert.pem -days 365 \
  -subj "/CN=MyKafkaCA"

# Create the broker keystore with a new RSA key pair
keytool -genkeypair -alias kafka-server \
  -keyalg RSA -keysize 2048 \
  -keystore server.keystore.jks \
  -storetype PKCS12 \
  -dname "CN=kafka-broker" \
  -validity 365 \
  -storepass password -keypass password

# Generate a certificate signing request (CSR) for the broker
keytool -certreq -alias kafka-server \
  -keystore server.keystore.jks \
  -file kafka-server.csr \
  -storepass password

# Sign the CSR with the CA, applying the SAN from openssl-san.cnf
openssl x509 -req \
  -in kafka-server.csr \
  -CA ca-cert.pem -CAkey ca-key.pem -CAcreateserial \
  -out kafka-server-signed.crt \
  -days 365 \
  -extensions v3_req \
  -extfile openssl-san.cnf

# Import CA
keytool -keystore server.keystore.jks \
  -alias CARoot \
  -import -file ca-cert.pem \
  -storepass password -noprompt

# Import signed cert
keytool -keystore server.keystore.jks \
  -alias kafka-server \
  -import -file kafka-server-signed.crt \
  -storepass password

# Create the truststore containing the CA certificate
keytool -keystore server.truststore.jks \
  -alias CARoot \
  -import -file ca-cert.pem \
  -storepass password -noprompt
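
As an optional sanity check, you can confirm that the SAN was applied to the signed certificate and that both the signed certificate and the CA were imported into the keystore:

# Verify the SAN on the signed broker certificate
openssl x509 -in kafka-server-signed.crt -noout -text | grep -A1 "Subject Alternative Name"

# List the keystore entries (expects kafka-server and caroot)
keytool -list -keystore server.keystore.jks -storepass password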
  3. Create a secret in the Kubernetes cluster to store these certificates.

kubectl create secret generic kafka-certs \
  --from-file=server.keystore.jks \
  --from-file=server.truststore.jks \
  --from-file=ca-cert.pem
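
Optionally, verify that all three files are present in the secret before continuing:

# The Data section should list ca-cert.pem, server.keystore.jks and server.truststore.jks
kubectl describe secret kafka-certs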
  4. Install the helm chart for hybrid-saas and add the following attribute at the end of the helm install command. This will configure kafka to use TLS on port 9093.

--set mini_runtime.kafka1.useTls=true
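
For reference, a full invocation might look like the sketch below. The release name, chart reference and namespace are placeholders, not Akto-specific values; reuse the ones from your existing hybrid runtime installation.

# Placeholders: substitute your own release, chart and namespace
helm upgrade --install <RELEASE_NAME> <HYBRID_RUNTIME_CHART> \
  -n <NAMESPACE> \
  --reuse-values \
  --set mini_runtime.kafka1.useTls=true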
  5. Configure producers to use TLS. Traffic connectors, which are generally deployed as DaemonSets, need to be configured to use TLS to send data to the kafka broker. Here is the updated configuration for the kubernetes connector; we've mounted the ca-cert.pem file from the secret onto the DaemonSet's file system.

    apiVersion: apps/v1
    kind: DaemonSet
    metadata:
      name: akto-k8s
      namespace: {NAMESPACE}
      labels:
        app: akto-collector
    spec:
      selector:
        matchLabels:
          app: akto-collector
      template:
        metadata:
          labels:
            app: akto-collector
        spec:
          hostNetwork: true
          dnsPolicy: ClusterFirstWithHostNet
          containers:
          - name: mirror-api-logging
            image: aktosecurity/mirror-api-logging:k8s_agent
            env: 
              - name: AKTO_TRAFFIC_BATCH_TIME_SECS
                value: "10"
              - name: AKTO_TRAFFIC_BATCH_SIZE
                value: "100"
              - name: AKTO_INFRA_MIRRORING_MODE
                value: "gcp"
              - name: AKTO_KAFKA_BROKER_MAL
                value: "<AKTO_NLB_IP>:9093"
              - name: AKTO_MONGO_CONN
                value: "mongodb://0.0.0.0:27017"
              # Additional TLS configuration
              - name: USE_TLS
                value: "true"
              - name: TLS_CA_CERT_PATH
                value: "/app/certs/ca-cert.pem"
            volumeMounts:
              - name: kafka-certs
                mountPath: /app/certs
          volumes:
            - name: kafka-certs
              secret:
                secretName: kafka-certs

Similar configuration can also be added to the eBPF traffic connector.

To customize the helm chart, you may take reference from the helm-charts repository.

Note:

  1. You can also disable hostname verification by adding the INSECURE_SKIP_VERIFY environment variable to the traffic connector and setting its value to true (see the snippet after this note).

  2. You might need to change the value of DNS.1 based on your deployment in step 4. In that case, recreate the certificates after deploying the helm chart and use the new ones.
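
In the DaemonSet configuration above, disabling hostname verification (note 1) is one more entry under env:

              # Add under the existing env: list of the traffic connector
              - name: INSECURE_SKIP_VERIFY
                value: "true"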

Get Support for your Akto setup

There are multiple ways to request support from Akto. We are available 24x7 on the following channels:

  1. In-app Intercom support. Message us with your query on Intercom in the Akto dashboard and someone will reply.

  2. Contact help@akto.io for email support.

  3. Join our discord channel for community support.

  4. Contact us here.
