Accessing Postgres server from a secondary docker container without db.py


Paul Ellis

Dec 17, 2019, 3:38:09 AM
to web2py-users
How do I connect to a Postgres server using PyDAL without a db.py in the secondary container, and without having all the tables dropped when I connect without the table definitions?

I have a server running Docker with containers for the nginx proxy, Let's Encrypt, web2py and Postgres.

One docker_compose.yml brings up nginx and Let's Encrypt on a shared Docker network:
version: '3'

services:
  nginx:
    image: nginx:1.13.1
    container_name: nginx-proxy
    ports:
      - "80:80"
      - "443:443"
      - "8080:8080"
    volumes:
      - conf:/etc/nginx/conf.d
      - vhost:/etc/nginx/vhost.d
      - html:/usr/share/nginx/html
      - certs:/etc/nginx/certs
    labels:
      - "com.github.jrcs.letsencrypt_nginx_proxy_companion.nginx_proxy=true"

  dockergen:
    image: jwilder/docker-gen:0.7.3
    container_name: nginx-proxy-gen
    depends_on:
      - nginx
    command: -notify-sighup nginx-proxy -watch -wait 5s:30s /etc/docker-gen/templates/nginx.tmpl /etc/nginx/conf.d/default.conf
    volumes:
      - conf:/etc/nginx/conf.d
      - vhost:/etc/nginx/vhost.d
      - html:/usr/share/nginx/html
      - certs:/etc/nginx/certs
      - /var/run/docker.sock:/tmp/docker.sock:ro
      - ./nginx.tmpl:/etc/docker-gen/templates/nginx.tmpl:ro

  letsencrypt:
    image: jrcs/letsencrypt-nginx-proxy-companion
    container_name: nginx-proxy-le
    depends_on:
      - nginx
      - dockergen
    environment:
      NGINX_PROXY_CONTAINER: nginx-proxy
      NGINX_DOCKER_GEN_CONTAINER: nginx-proxy-gen
    volumes:
      - conf:/etc/nginx/conf.d
      - vhost:/etc/nginx/vhost.d
      - html:/usr/share/nginx/html
      - certs:/etc/nginx/certs
      - /var/run/docker.sock:/var/run/docker.sock:ro

volumes:
  conf:
  vhost:
  html:
  certs:

# Do not forget to 'docker network create nginx-proxy' before launch, and to add '--network nginx-proxy' to proxied containers.

networks:
  default:
    external:
      name: nginx-proxy


The 'web2py' docker_compose.yml brings up the db and web2py containers and connects them to the nginx network:
version: '3'

services:
  db:
    image: postgres:11
    restart: always
    env_file: ./config/db/db_env
    expose:
      - 5432
    volumes:
      - db_volume:/var/lib/postgresql/data

  angebotstool:
    build: .
    volumes:
      - web2py_apps:/home/act/web2py/applications
    expose:
      - 80
      - 443
    environment:
      VIRTUAL_HOST: foo.barr.com
      LETSENCRYPT_HOST: foo.barr.com
      LETSENCRYPT_EMAIL: foo@bar.com
    depends_on:
      - db

volumes:
  db_volume:
    driver: local
    driver_opts:
      type: 'none'
      o: 'bind'
      device: '/home/act/databases/'
  web2py_apps:
    driver: local
    driver_opts:
      type: 'none'
      o: 'bind'
      device: '/home/act/web2py-gunicorn/web2py/applications'

networks:
  default:
    external:
      name: nginx-proxy


This was mostly configured with luck and trial and error. I am still learning in this area.

Now I would like to have a container running Scrapy or similar which will periodically log into a site and update the product information (when required).

I want to use PyDAL to connect to the existing database. The Scrapy container will be configured in the 'web2py' docker_compose.yml.
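Roughly, I imagine adding a service along these lines to the same compose file so that it shares the network with the db container (the service name and build path are just placeholders):

  scraper:
    build: ./scraper          # placeholder path to the Scrapy project
    depends_on:
      - db
    # on the shared compose network this container should be able to
    # reach Postgres at host 'db', port 5432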

How do I do this without having to maintain a second copy of db.py in the Scrapy container, and without having all the tables dropped when I connect without the table definitions?
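For reference, this is roughly the connection I have in mind from the scraper side. The credentials are placeholders (in practice they would come from the same db_env values), and I am unsure whether migrate_enabled=False plus migrate=False on define_table is the right way to keep PyDAL from creating, altering or dropping the existing tables:

from pydal import DAL, Field

# 'db' is the Postgres service name on the shared compose network;
# user, password and database name below are placeholders.
db = DAL('postgres://myuser:mypassword@db:5432/mydatabase',
         migrate_enabled=False)   # never run migrations from this container

# Re-declare only the tables the scraper needs; migrate=False should map
# to the existing table without trying to (re)create or alter it.
db.define_table('product',
                Field('name'),
                Field('price', 'double'),
                migrate=False)

rows = db(db.product.id > 0).select()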