How to protect API endpoints

How to protect API endpoints with nginx in a more convenient way. The approach may cost a little performance and leans on the if directive, which nginx.com famously calls evil ("If is Evil...") without offering a normal alternative for this case :) And keep in mind that any limitation can still be abused...

What we have: a web service with an API where some endpoints should be reachable only internally, while the public ones may or may not need rate limiting. Traditionally this means adding a lot of per-location lines. But maybe there is another way?
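For comparison, the traditional way would be to repeat allow/deny and limit_req in every location, roughly like this (only a sketch: the /internal/metrics path and the backend upstream are placeholders):

# defined in the http context
limit_req_zone $binary_remote_addr zone=dl_limit:10m rate=3r/s;

server {
  listen 443 ssl http2;

  # internal-only endpoint: the allow/deny list has to be repeated
  # in every such location
  location /internal/metrics {
    allow 172.16.1.21;
    allow 172.16.2.21;
    deny all;
    proxy_pass http://backend;
  }

  # public endpoint with rate limiting
  location /remote/downloads {
    limit_req zone=dl_limit burst=5 nodelay;
    proxy_pass http://backend;
  }

  # public endpoint without rate limiting
  location /a/v1/capabilities {
    proxy_pass http://backend;
  }
}

With map and geo the same rules can be declared once, up front, and the locations stay untouched: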
# Classify request URIs: 0 = internal only (default),
# 1 = public without rate limiting, 2 = public with rate limiting.
map $uri $uri_acl {
  default 0;
  ~^/remote/downloads 2;
  ~^/remote/uploads 2;
  ~^/a/v1/capabilities 1;
  ~^/a/v1/users 1;
}

# Build the rate-limit key: an empty key means the request is not
# rate limited at all; class 2 is limited per client IP + URI.
map $uri_acl $url_limit_key {
  0 "";
  1 "";
  2 $remote_addr$request_uri;
}

# Internal source addresses allowed to reach internal-only (uri_acl=0) endpoints.
# geo also accepts CIDR notation, so a whole subnet (e.g. 172.16.0.0/16) could
# be whitelisted with a single line if that matches your network.
geo $remote_addr $ip_acl {
  default 0;
  172.16.1.21 1;
  172.16.2.21 1;
  172.16.3.21 1;
  172.16.4.21 1;
}

limit_req_zone $url_limit_key zone=api_limit:100m rate=3r/s;
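# 100 MB of shared state, 3 requests per second per key; requests with an
# empty $url_limit_key are not accounted at all. If zone memory matters,
# $binary_remote_addr (as used in the nginx docs) makes a more compact key
# than $remote_addr$request_uri.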

server {
  listen 80;
  listen 443 ssl http2;

  ...

  # Evaluate both ACLs and combine them into a two-character status "<uri><ip>".
  # These ifs only set variables or return at server level, which avoids the
  # tricky cases described in "If is Evil".
  set $ip_acl_status 0;
  if ($ip_acl = "1") { set $ip_acl_status 1; }
  set $uri_acl_status 0;
  if ($uri_acl = "1") { set $uri_acl_status 1; }
  if ($uri_acl = "2") { set $uri_acl_status 1; }
  set $acl_status "${uri_acl_status}${ip_acl_status}";

  # Neither a public URI nor an internal client: forbidden.
  if ($acl_status = "00") { return 403; }

  # Applied to every request in this server; requests whose $url_limit_key
  # is empty are simply not counted against the limit.
  limit_req zone=api_limit burst=5 nodelay;
  limit_req_status 429;
  limit_req_log_level warn;

  ...

  location / {
    ...
  }

}

Fin:

1. Endpoints with uri_acl=0: 200 without rate limiting for hosts in ip_acl, 403 for everyone else
2. Endpoints with uri_acl=1: 200 without rate limiting for everyone
3. Endpoints with uri_acl=2: 200 with rate limiting for everyone
4. Any other combination can be implemented the same way (see the sketch below)
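For example, a hypothetical class 3 for endpoints that should be both internal-only and rate-limited needs nothing more than one extra entry in each map (the /a/v1/reports path is made up for illustration). Class 3 never sets $uri_acl_status to 1, so only hosts from ip_acl get through, and its non-empty limit key means even those requests are rate limited:

map $uri $uri_acl {
  default 0;
  ~^/remote/downloads 2;
  ~^/remote/uploads 2;
  ~^/a/v1/capabilities 1;
  ~^/a/v1/users 1;
  ~^/a/v1/reports 3;
}

map $uri_acl $url_limit_key {
  0 "";
  1 "";
  2 $remote_addr$request_uri;
  3 $remote_addr$request_uri;
}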
