Today we'll share an automated deployment tool for Python: Fabric.

I mentioned tar, rsync, and supervisor in the previous article, “Linux series of pits (1) - Commonly used 3 commands”.

Careful readers will have noticed, though, that one tool was still missing.

That is the tool that ties these three steps together. Over SSH, it lets us run shell commands on a remote server conveniently and intelligently.

Beyond interacting with remote servers, it also makes it very convenient to run complex, customizable Python functions in day-to-day work.

The installation

pip install fabric
pip freeze > requirements.txt  # optionally pin it in requirements.txt

An example from the official website:

def hello(name='sitin'):
    print("Hello world %s!" % name)

Run it with fab: fab hello, which prints “Hello world sitin!”.

fab is the command-line tool installed with Fabric, and it is what we mainly operate through.

The commands I personally use most are:

run: execute a command on the remote server
local: execute a command locally
cd: change directory on the remote server
lcd: change directory locally
@task: decorator that declares a function as a fab task

For simple scripts this is enough; more complex needs call for more operations, which we will look at in detail later.
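As a sketch of how these commands fit together, here is a minimal fabfile fragment (assuming the Fabric 1.x API used in this article; the host, user, and paths are hypothetical):

```python
# fabfile.py -- a minimal sketch (Fabric 1.x API; host and paths are hypothetical)
from fabric.api import task, run, local, cd, lcd, env

env.hosts = ['test@yourip']  # hypothetical remote host


@task
def uptime():
    run('uptime')  # executed on the remote server


@task
def sync():
    with lcd('.'):  # local working directory
        local('tar -czf /tmp/app.tar.gz .')  # package the code locally
    with cd('/data/your_project'):  # remote directory for the following commands
        run('ls -l')
```

You would then run, for example, fab uptime or fab sync.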

Deployment steps

In general, as Python engineers we publish code to do the following common things:

  1. git pull to fetch the latest code, e.g. the master branch (or the develop branch)

  2. tar to package the latest code

  3. rsync to incrementally sync it to the remote server, excluding some unwanted local directories

  4. Back up the database or back up the code

  5. supervisor to restart one or more specified remote services, chosen via an interactive command

  6. Check Sentry to confirm the logs are normal

Everything except the last step is done in fabfile.py, which normally lives in the root directory of your project, although you can put it anywhere else and point to it with **-f**.

Let's look at a practical example below.

A case in point

from fabric.api import (
    with_settings,  # decorator form of the settings() context manager
    hosts,
    cd,     # change directory on the remote server
    lcd,    # change directory locally
    run,    # run a remote command
    local,  # run a local command
    env,
)

TEST_ENV = '127.0.0.1'
TEST_USER = 'test'
env.forward_agent = True


@hosts(TEST_ENV)
@with_settings(user=TEST_USER)
def deploy_test():
    """Release the test environment."""
    # Pull the latest code locally; with CI this can happen automatically,
    # so no local pull would be needed.
    local('git pull --rebase upstream dev')
    # Upload the code, excluding unwanted directories.
    local('rsync -r . --exclude=tmp/ --exclude=backup/ sitin@yourip:/data/your_project')
    with cd('/data/your_project'):  # all following operations run under this remote directory
        run('docker-compose pull test')  # "test" is the service/image name
        backup_db()                      # an ordinary backup function, defined elsewhere
        run('docker-compose stop test')  # run executes remote commands, as opposed to local
        run('docker-compose rm -f test')
        run('docker-compose run --rm test python manage.py migrate')  # db sync
        run('docker-compose up -d test')

Run it from the terminal:

fab deploy_test     # publish the test environment
fab deploy_product  # publish production, if such a task exists

Test, development, and production servers often differ from one another, and this approach handles that. Beyond the above, if we are deploying to all servers or to several at once, we can choose which servers to publish to by defining different roles.

env.roledefs = {
    'test': ['test@yourip'],  # multiple machines can be listed here
    'dev': ['dev@yourip'],
    'prod': ['opt@yourip2'],
}

def deploy(branch='master'):
    pass

Then run fab -R test -f fabfile.py deploy:branch=dev, where -R selects the role defined above and the branch argument chooses which branch to publish.

In this way, it is very simple and convenient to target the servers of a given role and release a given branch. Frequently used complex commands can also be wrapped up as short fab commands.
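As a hedged sketch, the deploy stub above might be filled in like this (Fabric 1.x; the remote directory and service name are hypothetical):

```python
from fabric.api import task, roles, run, cd


@task
@roles('test')  # default role; can be overridden on the command line with -R
def deploy(branch='master'):
    with cd('/data/your_project'):       # hypothetical remote project directory
        run('git checkout %s' % branch)  # switch to the requested branch
        run('git pull')                  # fetch the latest code
        run('docker-compose up -d test') # restart the "test" service
```

Invoked as fab -R prod deploy:branch=master, the same task then publishes that branch to the servers of the chosen role.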

Other

docker-compose run --rm python manage.py migrate

If you wrap this in a fabfile.py function, then, much like a Linux alias, fab migrate gets it done.
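A minimal sketch of such an alias-style task (a fabfile fragment assuming Fabric 1.x; the remote directory is hypothetical):

```python
from fabric.api import task, run, cd


@task
def migrate():
    # Equivalent to typing the long docker-compose command; now it is just: fab migrate
    with cd('/data/your_project'):  # hypothetical remote project directory
        run('docker-compose run --rm python manage.py migrate')
```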

Beyond the above, we can also color the release output to distinguish a successful release from a failed one and warn us, which is very convenient and fast.

For more parameters, see fab -h. That's it for today.

If any of the docker-compose commands are unclear, refer to my previous article. Besides Fabric, there are more complex and powerful Python server operations and maintenance tools such as SaltStack and Ansible; we will share them later. Thank you for your attention.

A Python public account that sticks to original content.

Related articles:

Container Deployment Practices for Django Applications (Part 2)

Deploying Django applications using Docker Container Deployment Practices (Part 1)

Linux series pit remember (1)- common 3 commands

Docker and Docker Swarm
