Introduction to Continuous Integration for a Python Package 


Continuous integration and continuous deployment (CI/CD) is one of the most popular best practices because it provides a simpler, more controlled way to develop and deploy a web application, preventing upgrade disruptions and keeping the service running without downtime. As a result, there is a strong desire to extend this method to code that does not need to be continuously deployed and tested — a Python package, for instance.

Python projects benefit greatly from continuous integration, and adopting it is well worth the effort for any Python developer.

More specifically, continuous integration (CI) has the following benefits:

  • It gives you confidence that your modifications have not introduced a bug.
  • It eliminates issues and conflicts while integrating your code.
  • It makes sure the code runs on systems other than the one on which it was developed.

In other words, CI/CD makes development cycles faster, increases confidence in the results, and ensures more stable software with fewer integration conflicts.

Starting a pipeline

Even though you can create your own Tekton pipeline from scratch with all the required steps in a toolchain, you should use a CI toolchain template offered by IBM Cloud to guarantee security compliance and code inspections.

As a result, you begin reading the documentation and sample code but discover nothing about Python: all you can find while searching for web apps are examples and recommendations for Node.js web container images.

Continuous Integration

An advantage of a DevSecOps implementation is that it offers a pipeline with some pre-built activities that can be customised to your needs through a YAML configuration file, which can add scripts to a regular run. Looking at the available pipeline skeletons, I discovered that the end artefacts are treated as container images and therefore go through steps such as containerizing the code and running image vulnerability checks. Simply put, in the Python case these steps do not apply.

In order to determine whether this pipeline implementation may be utilised for this purpose, you need to investigate what a conventional CI pipeline for a Python package build requires.

Steps involved

In other words, the actions you need to take are:

  • Find or make a tool image that can do any operation or check that is typically used in Python programming.
  • Analyse the DevSecOps Tekton stages and determine which are still relevant and worth running.
  • Find more effective tools for these stages.
  • Find a replacement in the event that certain required integrations are absent or broken.

Once each of these activities has been completed, you can determine whether the effort of employing a DevSecOps pipeline is producing positive outcomes. Either way, you will have outlined what must be done to establish a functional CI pipeline.

Python image

Pipelines run using containerized images, which are frequently publicly available and contain the tools needed to complete their tasks. Unfortunately, Python tools are frequently not included in these standard public images, even though they can be gathered and installed by a script that customises each pipeline stage.

You can enter such a script, with the instructions to load the missing tools and complete the Python stack, in a file called .pipeline-config.yaml, which has one section for each pipeline stage.
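
As a rough illustration, a stage section of .pipeline-config.yaml could install the Python tooling before running the checks. This is only a sketch: the stage name, the available keys, and the exact schema are assumptions to be verified against the IBM Cloud DevSecOps documentation.

```yaml
# Sketch only: stage names and keys are assumptions; check the IBM Cloud
# DevSecOps documentation for the exact .pipeline-config.yaml schema.
version: '1'

test:
  abort_on_failure: true
  script: |
    #!/usr/bin/env bash
    set -e
    # Complete the Python stack on the default public image at run time.
    python3 -m pip install --upgrade pip
    python3 -m pip install flake8 pytest build
    # Typical Python package checks.
    flake8 .
    pytest
```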

However, once the pipeline is operational, this process makes every run longer and more tedious. Additionally, this method may scatter numerous repeated instructions that are challenging to keep aligned even when only minor modifications are made, especially if you choose to customise multiple stages and more than one pipeline.

A superior method is to build a new, lightweight image that already embeds the Python stack and hosts all the required tools, and to run it to complete each stage.
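
Hypothetically, once such an image is published to a registry, each customised stage can simply point to it instead of installing tools on every run. The image name and registry below are placeholders, not a real artefact.

```yaml
# Sketch only: the image reference is a placeholder for your own
# Python-ready tool image, possibly hosted in a private registry.
test:
  image: us.icr.io/my-namespace/python-ci-tools:1.0
  script: |
    #!/usr/bin/env bash
    set -e
    # The tools are already baked into the image, so the stage only runs the checks.
    flake8 .
    pytest
```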

Creating your own image is especially important for carrying out pipeline steps that run on a private worker; in DevSecOps pipelines, such container images can be loaded from private registries.

If maintaining this image becomes too laborious, you can still change any execution stage through the YAML configuration file. The IBM Cloud product documentation goes into great depth about how to use YAML configuration files.

Stages to be eliminated: scripts or forked repositories

DevSecOps CI pipelines define and provide a number of stages to finish a code integration, as already explained.

Some of these steps are entirely useless to authors of Python packages because they are designed for a web application that integrates microservices in a Kubernetes environment. One such example is the stage where the resulting Docker image is examined for vulnerabilities, since the Python package process typically does not produce container images.

There are two distinct ways to achieve this pipeline redesign:

  • Use phantom stages that are empty and only echo a message (see the sketch after this list).
  • Fork the repository that holds the Tekton definitions and eliminate or rework the unneeded pipeline steps there.
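
The first option could look roughly like the sketch below, which overrides a stage that makes no sense for a Python package so that it only echoes a message. The stage name is an assumption and should match the stage you actually want to neutralise.

```yaml
# Sketch only: turn an irrelevant stage into a phantom that just echoes.
containerize:
  abort_on_failure: false
  script: |
    #!/usr/bin/env bash
    echo "Skipped: a Python package build does not produce a container image."
```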

Even if the second method runs the pipeline more efficiently, the administrator must still check the original repository on a regular basis to see whether any definitions have been updated, and port those improvements and corrections into the forked version.

This is a tedious procedure that can occasionally lead to mistakes. Compared with the automatic refresh of the Tekton procedures that the standard configuration provides, the time saved by avoiding the loading of a container image used only to echo a variable is not beneficial unless specific needs are present.

In any case, a quick test to measure the timing difference between these two options can help you make the final decision. Even so, since the worker system caches the container image used by the pipeline, there should not be any noticeable timing difference.

Artifactory Integration

DevSecOps pipelines can add and configure JFrog Artifactory as a tool integration, but sadly a Python Package Index (PyPI) repository is not among the supported repository types. This confirms my preference to use the Artifactory integration only to acquire the Artifactory credentials, and then access the PyPI repository by configuring pip appropriately.

It is feasible to retrieve the Artifactory credentials by adding an environment property to the ones available inside each CI pipeline. Open the pipeline, choose Environment properties from the menu that appears, and add a new property of the “tool integration” type.
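
As an illustrative sketch, the script of an early stage could then read that property and point pip at the Artifactory PyPI repository. The property names, the use of the get_env helper, and the repository URL below are assumptions for illustration only.

```yaml
# Sketch only: property names and the repository URL are placeholders.
setup:
  script: |
    #!/usr/bin/env bash
    set -e
    # Read the Artifactory credentials exposed as pipeline environment properties.
    ARTIFACTORY_USER="$(get_env artifactory-user "")"
    ARTIFACTORY_TOKEN="$(get_env artifactory-token "")"
    # Configure pip to resolve packages from the Artifactory PyPI repository.
    pip config set global.index-url \
      "https://${ARTIFACTORY_USER}:${ARTIFACTORY_TOKEN}@example.jfrog.io/artifactory/api/pypi/pypi-remote/simple"
```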

Conclusion

Even though a Python package's code is built more gradually and does not need to maintain a continuously running service, you should always aim for the highest level of quality. Therefore, all of the tests are crucial to creating high-quality software and can help you deliver packages free of errors.
A pipeline compels all the developers involved to pay attention to their communication, maintain a clear commit history, and share a consistent understanding of the software's architecture and design. Additionally, the emphasis on various tests, including unit, system, and acceptance tests, can direct development to rapidly identify which changes are intended features and enhancements. As a result, it is easier to implement agile methodologies and to prevent the development of out-of-scope routines. The future of Python is safe with Continuous Integration.
