Commit 1c700deb authored by Diaz de Arcaya Serrano, Josu's avatar Diaz de Arcaya Serrano, Josu
updating main branch

parent 9872248e
Showing 420 additions and 137 deletions
 # T51 IaC Executor Manager
-Running the server
+Running the server with uvicorn
 ```bash
 uvicorn main:app --reload
 ```
+Execute it directly
+```bash
+./main.py
+```
 ###### Containers
 Containerize the IEM
-```bash
-docker build --build-arg API_KEY=$API_KEY -t optima-piacere-docker-dev.artifact.tecnalia.com/wp5/iem-api:y1 .
-```
-Similarly, docker compose can be used to build both
-```bash
-docker-compose build
-```
-It can also be used to push them to the registry
 ```bash
-docker-compose push
+docker build --build-arg API_KEY=$API_KEY -t optima-piacere-docker-dev.artifact.tecnalia.com/wp5/iem-api:y1 .
 ```
-Run the IEM
+Run the dockerized IEM
 ```bash
 docker run -p 8000:8000 optima-piacere-docker-dev.artifact.tecnalia.com/wp5/iem-api:y1
 ```
@@ -41,4 +40,15 @@ Run a single test
 nose2 -v tests.core.test_iem.TestIem.test_deploy_destroy_openstack
 ```
+Run unit and integration tests
+```bash
+nose2 -v tests.unit
+nose2 -v tests.it
+```
+Integration tests are prevented from running unless an environment variable is deliberately set
+```bash
+AWS=1 nose2 -v tests.it
+```
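The AWS=1 gate described above can be reproduced with `unittest.skipUnless`; a minimal sketch (hypothetical module, not the project's actual test code):

```python
import os
import unittest


class TestIt(unittest.TestCase):
    """Integration tests, skipped unless AWS=1 is exported (mirrors the gate above)."""

    @unittest.skipUnless(os.getenv("AWS") == "1", "set AWS=1 to run integration tests")
    def test_deploy(self):
        # a real test would deploy against AWS here
        self.assertTrue(True)
```

The decorator evaluates the environment once at class-definition time, so the variable must be set before the test module is imported.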
@@ -4,6 +4,7 @@
 Introduction
 ************
-.. TODO Provide a brief description of the component here. Outline its goals, functionalities, etc.;
-   Mention subcomponents or extra delivered tools etc., with rst references to adequate sections.
+The IaC Execution Manager utilizes different technologies that can be used for the provisioning, configuration, and orchestration of the different infrastructural devices found in a production deployment. This has allowed us to provide evidence and reasoning for the selection of the technologies that the IEM prototype utilizes.
+
+This prototype is viable for the deployment of different IaC technologies covering the provisioning and configuration of the infrastructural devices required by the projects utilizing the PIACERE framework. It provides a unified interface so that other components can interact with the IEM in a uniform manner. It can also be deployed in production using container-based technologies, which makes this prototype viable for operationalization in public and private cloud providers, as well as on premises. For this prototype, the IEM supports two well-established technologies (i.e. Ansible and Terraform) that are able to provision the different infrastructural devices required by the use cases and to configure each of these devices so they can accommodate the applications to be allocated.
\ No newline at end of file
Feature: PIACERE Run Time
# The input of this scenario is detailed in the following
# https://git.code.tecnalia.com/piacere/private/t51-iem/iem/-/blob/y2/openapi.json#/deployments/deploy_deployments__post
# The following scenario relates to REQ81, REQ83, REQ84, REQ87
Scenario: Deploy a fresh project which comprises terraform, ansible, and docker
Given a project bundle in the relevant IaC technologies (terraform, ansible, docker-compose), the deployment id, and the required cloud credentials
When the user triggers the deployment
Then the IEM is invoked
And executes the stages of the bundle asynchronously
And the user is notified that the deployment has been accepted
# The input of this scenario is detailed in the following
# https://git.code.tecnalia.com/piacere/private/t51-iem/iem/-/blob/y2/openapi.json#/deployments/read_status_deployment_deployments__deployment_id__get
# The following scenario relates to REQ55, REQ82
Scenario: Query the status of a running project
Given the deployment id of an already existing project
When the user queries the status of the project
Then the IEM is invoked
And the user is notified of the status
# The input of this scenario is detailed in the following
# https://git.code.tecnalia.com/piacere/private/t51-iem/iem/-/blob/y2/openapi.json#/deployments/undeploy_undeploy__post
# The following scenario relates to REQ81, REQ83, REQ84, REQ85
Scenario: Undeploy a project
Given the deployment id of an already existing project and the required cloud credentials
When the user triggers the undeployment
Then the IEM is invoked
And tears down the entire deployment asynchronously
And the user is notified that the undeployment has been accepted
# The input of this scenario is detailed in the following
# https://git.code.tecnalia.com/piacere/private/t51-iem/iem/-/blob/y2/openapi.json#/deployments/read_status_deployment_deployments__deployment_id__get
# The following scenario relates to REQ55, REQ82
Scenario: Query the status of an undeployed project
Given the deployment id of an undeployed project
When the user queries the status of the project
Then the IEM is invoked
And the user is notified of the status
# The input of this scenario is detailed in the following
# https://git.code.tecnalia.com/piacere/private/t51-iem/iem/-/blob/y2/openapi.json#/deployments/deploy_deployments__post
# The following scenario relates to REQ12, REQ81, REQ83, REQ84, REQ87
Scenario: Redeploy a project
Given a project bundle in the relevant IaC technologies (terraform, ansible, docker-compose), the deployment id, and the required cloud credentials
When the user triggers the deployment
Then the IEM is invoked
And executes the stages of the bundle asynchronously
And the user is notified that the deployment has been accepted
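The deploy scenarios above can be exercised against the REST API with `requests` (pinned in the project's requirements.txt); a hedged sketch in which the payload field names are inferred from the scenarios and the linked openapi.json, and the URL, API key, and credential values are placeholders:

```python
import base64


def build_deploy_payload(deployment_id: str, bundle_bytes: bytes, credentials=None) -> dict:
    """Shape of the deploy request body: the IaC bundle travels base64-encoded."""
    return {
        "deployment_id": deployment_id,
        "bundle": {"base64": base64.b64encode(bundle_bytes).decode("ascii")},
        "credentials": credentials or {},  # cloud credentials would go here
    }


def trigger_deployment(base_url: str, api_key: str, payload: dict):
    """POST the bundle; the IEM answers immediately and deploys asynchronously."""
    import requests  # network call, only needed when actually invoking the API

    r = requests.post(
        f"{base_url}/deployments/",
        json=payload,
        headers={"x-api-key": api_key},
        timeout=10,
    )
    r.raise_for_status()
    return r.json()
```

A 201 response only means the request was accepted; per the scenarios, the status endpoint must be polled to observe the asynchronous stages.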
/out
\ No newline at end of file
@startuml
title Request the Current Status of a Deployment
participant "PRC" as DESIDE
box "IaC Execution Manager" #LightBlue
participant "Rest API" as RTIEM_api #99FF99
participant Core as RTIEM_core #99FF99
participant Persistence as RTIEM_db #99FF99
end box
DESIDE -> RTIEM_api: Deployment Status Request
RTIEM_api -> RTIEM_core: Deployment Status Request
RTIEM_core -> RTIEM_db: Deployment Status Request
RTIEM_core <-- RTIEM_db: Deployment Status Response
RTIEM_api <-- RTIEM_core: Deployment Status Response
DESIDE <-- RTIEM_api: Deployment Status Response
@enduml
@startuml
title Initiate Deployment
participant "Runtime Controller (PRC)" as RTPRC
box "IaC Execution Manager" #LightBlue
participant "Rest API" as RTIEM_api #99FF99
participant Core as RTIEM_core #99FF99
participant Persistence as RTIEM_db #99FF99
participant "Executor" as executor #99FF99
end box
collections "Resource Provider" as infraresource
RTPRC -> RTIEM_api: Deployment Request
RTPRC <-- RTIEM_api: Deployment Response
RTIEM_api -> RTIEM_core: Deployment Request
RTIEM_core -> RTIEM_db: Save Deployment Started
RTIEM_core -> executor: Deployment Request
executor -> infraresource: Deploy Commands
executor -> infraresource: ...
executor -> infraresource: Deploy Commands
executor -> RTIEM_core: Deployment Response
RTIEM_core -> RTIEM_db: Save Deployment Status
@enduml
@startuml
title Initiate Undeployment
participant "Runtime Controller (PRC)" as RTPRC
box "IaC Execution Manager" #LightBlue
participant "Rest API" as RTIEM_api #99FF99
participant Core as RTIEM_core #99FF99
participant Persistence as RTIEM_db #99FF99
participant "Executor" as executor #99FF99
end box
collections "Resource Provider" as infraresource
RTPRC -> RTIEM_api: Undeployment Request
RTPRC <-- RTIEM_api: Undeployment Response
RTIEM_api -> RTIEM_core: Undeployment Request
RTIEM_core -> RTIEM_db: Save Undeployment Started
RTIEM_core -> executor: Undeployment Request
executor -> infraresource: Undeploy Commands
executor -> infraresource: ...
executor -> infraresource: Undeploy Commands
executor -> RTIEM_core: Undeployment Response
RTIEM_core -> RTIEM_db: Save Undeployment Status
@enduml
# T51 IaC Executor Manager Sequence diagrams
This folder contains the sequence diagrams developed for the T51 IEM. They have been developed using PlantUML
* https://plantuml.com
These files follow a very simple text-based syntax, e.g.
```
Bob->Alice : Hello!
```
which renders (provided PlantUML is enabled in GitLab, see https://docs.gitlab.com/ee/administration/integration/plantuml.html) as
```plantuml
Bob->Alice : Hello!
```
we can also reference a source file
```plantuml source="51-start-deployment.puml"
```
To edit the diagrams and check their rendering, there are several options:
* Generate the diagram with the PlantUML jar, which is not very user friendly
  ```
  java -jar plantuml.jar sequenceDiagram.txt
  ```
* Use an IDE plugin. Plugins are available for different IDEs, e.g.
  * Eclipse: https://plantuml.com/eclipse
  * Visual Studio Code: https://marketplace.visualstudio.com/items?itemName=jebbs.plantuml
-3.9.7
+3.9.10
 FROM hashicorp/terraform:1.1.4
-ARG API_KEY
+COPY requirements.txt /tmp/requirements.txt
+RUN apk add py3-pip cargo g++ python3-dev file libffi-dev openssl-dev bash python3 gnupg
+RUN pip3 install -r /tmp/requirements.txt
+# install docker stack
+RUN apk add docker docker-compose
-ENV API_KEY=$API_KEY
+ENV API_KEY=changeme
 ENV IEM_HOME=/opt/iem/
+ENV DOCKERIZED=true
 COPY src/resources/ansible.cfg /etc/ansible/ansible.cfg
-COPY requirements.txt /tmp/requirements.txt
-COPY src ${IEM_HOME}src
-COPY main.py ${IEM_HOME}main.py
-RUN apk add py3-pip cargo g++ python3-dev file libffi-dev openssl-dev bash python3=3.9.13-r1 gnupg
-RUN pip3 install -r /tmp/requirements.txt
 # RUN adduser -h ${IEM_HOME} -S -D iem
 COPY certs/config ${IEM_HOME}.ssh/config
@@ -24,11 +23,17 @@ RUN adduser -h ${IEM_HOME} -S -D iem && \
     chmod 0600 ${IEM_HOME}.ssh/id_rsa && \
     chmod 0644 ${IEM_HOME}.ssh/id_rsa.pub
 USER iem
+RUN ansible-galaxy collection install community.general
 COPY roles.yml /tmp/roles.yml
 RUN ansible-galaxy install -r /tmp/roles.yml
+RUN mkdir -p ${IEM_HOME}db && \
+    mkdir -p ${IEM_HOME}deployments
+COPY src ${IEM_HOME}src
+COPY main.py ${IEM_HOME}main.py
+COPY logging.ini ${IEM_HOME}logging.ini
 ENTRYPOINT ["/usr/bin/env"]
 WORKDIR ${IEM_HOME}
-CMD /usr/bin/uvicorn main:app --host 0.0.0.0
+CMD /usr/bin/uvicorn main:app --host 0.0.0.0 --log-level info
 EXPOSE 8000
Host *
StrictHostKeyChecking no
UserKnownHostsFile=/dev/null
\ No newline at end of file
-----BEGIN RSA PRIVATE KEY-----
MIIEpAIBAAKCAQEA1FrTNE42EgZr9WJNMtvpKFHYhPUJ4lzEp83EM0jYY3TyjmIe
ThMuqMLAHCk22fl4a8PttucggJ5ZWKhcJh623/y8AybJcmqZgq9a41Q609dmirf0
7frCl+6zL8Mqy2Le2BD4eRADcq11s8r8Ys6J+EBPHQgEnK9CeZLSc/WFRlVr4bOD
s0bEouDxjTAMYjYcpsCwqYgGdIXI9WWsnt3RvcEe8CaiTqoyDN8ZtgkG6MweSrTQ
js8ySHO6o25cOoF7aT9Ihhf32I+KUanNIOvk3RAw2z1FK5xkFbbqMggZqz7rJn3M
sn2dDiCQi2CWox2OYXV/jJKLC3UFuOX64fS9cwIDAQABAoIBAQCs69Tm1/Vx0ibh
aA4DJ06C1bsh8cP9v5soJgfp1xzWSGooBcA1xasOI6B6jhkrgNlNr/uIIEe4VLne
1yJKrGIwnUagrloGQMYGxDKXwYQx80p+FXRuwe7p96eUcjIL8tQSUCd1tdOI87VQ
FjBVaWiybfO+aUQQLytLgoK7iKfhb7vO+9F+ZK1iDjBDNxFuiOM5zoeWOI7boYkD
2yXIkwoBePS2rosbPLa649sVakKex2WhQdUFst4Zba2RhnWQBXUY44LvEK5TzScF
FyYphPOUSplbzzM2+fuOna91NIWmJyHmf15lj7X9kC66XFIZMlvapksB8stEpDiA
4al3IdBJAoGBAPPuM3xkr/kQYYn7E42fgpmINZ78V48gMLhpyUOabZtF8inMyMPB
q7kfHns8Vx0ET8grSNr3qwDDV82lwvGqRCFChASMdQRR9LanydkDSeqpuZyQtVlt
A/65YUdcNY7Vy+M+fRh5Srh/6qcO3beLeLWXbJ4RHBP/OEmHuF4mLfgVAoGBAN7c
qdxTOhXPvOU69Bs5rJdfo6qBI1Yg8MCGctsUwPFpw1kW773ROOPa6XI6D74Dsdg8
ypZ+IC3pRVtx61Xi3NOwxWNTTG+dyUgTSFz+WKjywXZXeHIbWngiFqk8JFYQWPzk
6YaJk4tZhk2YuNNaCCYRgQqyWv8coEurRlMXZHlnAoGBALcJwdaQ0z8oXJimL4jw
7ZX5kIrpPWanuAdZUe4Jfj+qX8mf4fKKbCowQLYmlBOw/ZDtcfDlMYsUCdnFjZ+7
rP3sJJYpM1F3khJRm3PdNOUCUMY8C+i7lejZADcE6SdyJFkztbjcowYI7nJHBHZL
ENvqcVW27wPOWlVKozz6lzn1AoGALVwmaoS6DtRwcwuzwZLUkR7TNhIAujgMKHN1
DyhDOR+4tfpYI39hH+dfmnM83wTrfsKozUawkAepqToflySMo72X/2Zl6VXpMPVT
xjGyo/h87fRRvI/asxblG9702luLcTW6XjrEQBmhn0uVWtc5T15CsIWqxb/y1FPx
BVp+hcMCgYAlJXbjzjbbDoIOCsXPSPe9voBL8zVwp0aNuvQcuB/vCt1n1c1DWuPr
AGMy/fRwY0Znag+ODMuulm7RgXUQy6ifJHiz9cKVGg/mGifaJSjgC+1AI9HFlij3
asM5CueU0gK974rDxQkwmIWpRH57+kf6s8tGDrPPvqX9S4p3oxFlTw==
-----END RSA PRIVATE KEY-----
ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDUWtM0TjYSBmv1Yk0y2+koUdiE9QniXMSnzcQzSNhjdPKOYh5OEy6owsAcKTbZ+Xhrw+225yCAnllYqFwmHrbf/LwDJslyapmCr1rjVDrT12aKt/Tt+sKX7rMvwyrLYt7YEPh5EANyrXWzyvxizon4QE8dCAScr0J5ktJz9YVGVWvhs4OzRsSi4PGNMAxiNhymwLCpiAZ0hcj1Zaye3dG9wR7wJqJOqjIM3xm2CQbozB5KtNCOzzJIc7qjblw6gXtpP0iGF/fYj4pRqc0g6+TdEDDbPUUrnGQVtuoyCBmrPusmfcyyfZ0OIJCLYJajHY5hdX+MkosLdQW45frh9L1z josu@WKM0092A
iem.db
[loggers]
keys=root,src,uvicorn
[handlers]
keys=stream_handler
[formatters]
keys=formatter
[logger_root]
level=INFO
handlers=stream_handler
[logger_src]
level=INFO
handlers=stream_handler
qualname=src
propagate=0
[logger_uvicorn]
level=INFO
handlers=stream_handler
qualname=uvicorn
propagate=0
[handler_stream_handler]
class=StreamHandler
formatter=formatter
[formatter_formatter]
format=%(asctime)s %(name)-12s %(levelname)-8s %(message)s
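The ini above is consumed at startup through the stdlib config loader (as the new `logging.config.fileConfig("logging.ini")` line in main.py does); a self-contained sketch using a trimmed copy of the file, restricted to the root and src loggers. Note that `logging.config` must be imported explicitly:

```python
import logging
import logging.config  # fileConfig lives in this submodule
import tempfile

# Trimmed copy of the logging.ini shown above, written to a temporary
# file so the sketch does not depend on the repository layout.
INI = """\
[loggers]
keys=root,src

[handlers]
keys=stream_handler

[formatters]
keys=formatter

[logger_root]
level=INFO
handlers=stream_handler

[logger_src]
level=INFO
handlers=stream_handler
qualname=src
propagate=0

[handler_stream_handler]
class=StreamHandler
formatter=formatter

[formatter_formatter]
format=%(asctime)s %(name)-12s %(levelname)-8s %(message)s
"""

with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
    f.write(INI)
    ini_path = f.name

logging.config.fileConfig(ini_path, disable_existing_loggers=False)
logging.getLogger("src").info("logger configured from ini")
```

`propagate=0` stops records from the `src` logger being handled twice: once by its own stream handler and once by the root logger's.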
+#!/usr/bin/env python3
 import json
 import logging
-import os
-from fastapi import FastAPI, BackgroundTasks, status, Security, Depends, HTTPException
-from fastapi.openapi.utils import get_openapi
-from fastapi.security.api_key import APIKeyHeader, APIKey
 from typing import List
+import uvicorn
+from fastapi import (BackgroundTasks, Depends, FastAPI, HTTPException,
+                     Security, status)
+from fastapi.openapi.utils import get_openapi
+from fastapi.security.api_key import APIKey, APIKeyHeader
+from src import buildno, major, minor, revision
 from src.core.iem import Iem
-from src.core.persistence import Sqlite
+from src.core.persistence import Persistence
-from src.core.utils import (
-    BaseResponse,
-    DeploymentResponse,
-    DeploymentRequest,
-    DeleteDeploymentRequest,
-)
+from src.core.utils import (BaseResponse, DeleteDeploymentRequest,
+                            DeploymentRequest, DeploymentResponse,
+                            SelfHealingRequest, DeploymentStatusRequest)
-LOGGER = logging.getLogger("iem")
 api_key_header = APIKeyHeader(name="x-api-key", auto_error=False)
 app = FastAPI(
-    title="IaC Execution Manager", version="0.1.15", description="IaC Execution Manager"
+    title="IaC Execution Manager",
+    version=f"{major}.{minor}.{revision}.{buildno}",
+    description="IaC Execution Manager",
 )
+logging.config.fileConfig("logging.ini")
 async def get_api_key(api_key_query: str = Security(api_key_header)):
-    if Sqlite().valid_api_key(api_key_query=api_key_query):
+    if Persistence().valid_api_key(api_key_query=api_key_query):
         return api_key_query
     else:
         raise HTTPException(
@@ -37,22 +38,18 @@ async def get_api_key(api_key_query: str = Security(api_key_header)):
 @app.get("/", tags=["greeting"])
-async def read_root(api_key: APIKey = Depends(get_api_key)):
+async def read_root(_: APIKey = Depends(get_api_key)):
     return {
         "message": "Hello from the IaC Execution Manager!",
         "version": app.version,
         "terraform": "1.1.4",
-        "ansible": "5.5.0",
+        "ansible": "8.5.0",
     }
 @app.get("/deployments/", response_model=List[DeploymentResponse], tags=["deployments"])
 async def read_status(
-    start: int = 0,
-    count: int = 25,
-    start_date: str = "1970-01-01",
-    end_date: str = "2100-01-01",
-    api_key: APIKey = Depends(get_api_key),
+    _: APIKey = Depends(get_api_key),
 ):
     all_deployments = Iem(credentials=None).get_all_deployments()
     return list(all_deployments)
@@ -65,14 +62,29 @@
 )
 async def read_status_deployment(
     deployment_id: str,
-    start: int = 0,
-    count: int = 1,
-    api_key: APIKey = Depends(get_api_key),
+    _: APIKey = Depends(get_api_key),
 ):
     deployment = Iem().get_deployment(deployment_id=deployment_id)
     return list(deployment)
+@app.get(
+    "/deployments/{deployment_id}/stages/{stage_id}/outputs",
+    response_model=dict,
+    tags=["deployments"],
+)
+async def read_deployment_outputs(
+    deployment_id: str,
+    stage_id: str,
+    d: DeploymentStatusRequest,
+    _: APIKey = Depends(get_api_key),
+):
+    outputs = Iem(credentials=d.credentials).get_deployment_outputs(
+        deployment_id=deployment_id, stage_id=stage_id
+    )
+    return outputs
 @app.post(
     "/deployments/",
     status_code=status.HTTP_201_CREATED,
@@ -82,11 +94,10 @@ async def read_status_deployment(
 async def deploy(
     d: DeploymentRequest,
     background_tasks: BackgroundTasks,
-    api_key: APIKey = Depends(get_api_key),
+    _: APIKey = Depends(get_api_key),
 ):
-    logging.warning(d)
     i = Iem(credentials=d.credentials)
-    background_tasks.add_task(i.deploy, d.deployment_id, d.repository, d.commit)
+    background_tasks.add_task(i.deploy, d.deployment_id, d.bundle.base64)
     return BaseResponse(message="Deployment Request Created")
@@ -99,15 +110,48 @@ async def deploy(
 async def undeploy(
     d: DeleteDeploymentRequest,
     background_tasks: BackgroundTasks,
-    api_key: APIKey = Depends(get_api_key),
+    _: APIKey = Depends(get_api_key),
 ):
-    logging.warning(d)
     i = Iem(credentials=d.credentials)
     background_tasks.add_task(i.destroy, d.deployment_id)
     return BaseResponse(message="Undeployment Request Created")
-if os.getenv("STAGE") == "dev":
+@app.post(
+    "/deployments/{deployment_id}/self-healing",
+    status_code=status.HTTP_201_CREATED,
+    response_model=BaseResponse,
+    tags=["deployments"],
+)
+async def self_healing_strategy(
+    deployment_id: str,
+    d: SelfHealingRequest,
+    background_tasks: BackgroundTasks,
+    _: APIKey = Depends(get_api_key),
+):
+    i = Iem(credentials=d.credentials)
+    background_tasks.add_task(i.self_healing_strategy, deployment_id, d.playbook)
+    return BaseResponse(message="Self-Healing Strategy Request Triggered")
+@app.post(
+    "/update-iac-bundle/",
+    status_code=status.HTTP_201_CREATED,
+    response_model=BaseResponse,
+    tags=["deployments"],
+)
+async def self_healing_bundle(
+    d: DeploymentRequest,
+    background_tasks: BackgroundTasks,
+    _: APIKey = Depends(get_api_key),
+):
+    i = Iem(credentials=d.credentials)
+    background_tasks.add_task(i.self_healing_bundle, d.deployment_id, d.bundle.base64)
+    return BaseResponse(message="Bundle Replacement Created")
+if __name__ == "__main__":
+    uvicorn.run("main:app", host="127.0.0.1", port=8000, log_level="info")
     with open("../openapi.json", "w") as f:
         json.dump(
             get_openapi(
......
 fastapi==0.73.0
 uvicorn==0.17.0.post1
-ansible==5.5.0
-ansible-core==2.12.3
+ansible==8.5.0
 GitPython==3.1.26
 requests==2.26.0
 ratelimiter==1.2.0.post0
......
from ._version import buildno, major, minor, revision
major = 3
minor = 0
revision = 1
buildno = 18
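main.py in this commit assembles the FastAPI `version` string from these four fields; the composition, spelled out:

```python
# Values mirrored from src/_version.py shown above
major, minor, revision, buildno = 3, 0, 1, 18

# main.py passes exactly this to FastAPI(version=...)
version = f"{major}.{minor}.{revision}.{buildno}"
print(version)  # → 3.0.1.18
```

Keeping the numbers in a single `_version.py` lets the API banner, the OpenAPI document, and any packaging metadata report the same build.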
 import logging
 import os
 import subprocess
+import time
 from abc import ABC, abstractmethod
-from jinja2 import Template
-from subprocess import CalledProcessError
+from jinja2 import Environment, FileSystemLoader
-LOGGER = logging.getLogger("iem")
+LOGGER = logging.getLogger(__name__)
 class Factory:
@@ -26,6 +26,18 @@ class Engine(ABC):
         self._repo_path = repo_path
         self._env = env
+    def _run_command(self, args: list) -> subprocess.CompletedProcess:
+        output = subprocess.run(
+            args=args, cwd=self._repo_path, env=self._env, capture_output=True
+        )
+        if output.returncode == 0:
+            LOGGER.info(output.stdout.decode("utf-8"))
+            LOGGER.info(output.stderr.decode("utf-8"))
+        else:
+            LOGGER.error(output.stdout.decode("utf-8"))
+            LOGGER.error(output.stderr.decode("utf-8"))
+        return output
     @abstractmethod
     def apply(self):
         pass
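The `_run_command` helper added above centralizes subprocess handling for all engines; the pattern can be exercised standalone (a sketch mirroring the helper, with a trivial interpreter call standing in for terraform/ansible):

```python
import logging
import subprocess
import sys

LOGGER = logging.getLogger(__name__)


def run_command(args, cwd=None, env=None):
    """Run a command, capture both streams, and log them at a level
    matching the exit code (mirrors Engine._run_command above)."""
    output = subprocess.run(args=args, cwd=cwd, env=env, capture_output=True)
    log = LOGGER.info if output.returncode == 0 else LOGGER.error
    log(output.stdout.decode("utf-8"))
    log(output.stderr.decode("utf-8"))
    return output


# Stand-in for a terraform/ansible invocation
result = run_command([sys.executable, "-c", "print('ok')"])
```

Returning the whole `CompletedProcess` lets callers branch on `returncode` or call `check_returncode()` themselves, instead of catching `CalledProcessError` at every call site as the removed code did.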
@@ -43,107 +55,80 @@ class Engine(ABC):
 class Terraform(Engine):
-    def __init__(self, repo_path, my_env):
+    def __init__(self, repo_path, my_env, skip_inventory=False):
         super().__init__(name="Terraform", repo_path=repo_path, env=my_env)
     def apply(self):
         LOGGER.info("About to apply terraform")
-        try:
-            output = subprocess.run(
-                ["terraform", "init"],
-                check=True,
-                cwd=self._repo_path,
-                env=self._env,
-                capture_output=True,
-            )
-            output = subprocess.run(
-                ["terraform", "apply", "-auto-approve"],
-                check=True,
-                cwd=self._repo_path,
-                env=self._env,
-                # capture_output=True,
-            )
-            return "CREATED", output.stdout, output.stderr
-        except CalledProcessError as e:
-            LOGGER.exception(e)
-            return "ERROR", None, None
+        args = ["terraform", "init"]
+        output = self._run_command(args=args)
+        if output.returncode != 0:
+            return output.returncode, output.stdout, output.stderr
+        args = ["terraform", "apply", "-auto-approve"]
+        output = self._run_command(args=args)
+        return output.returncode, output.stdout, output.stderr
     def destroy(self):
-        try:
-            output = subprocess.run(
-                ["terraform", "destroy", "-auto-approve"],
-                check=True,
-                cwd=self._repo_path,
-                env=self._env,
-                # capture_output=True,
-            )
-            return "DESTROYED", output.stdout, output.stderr
-        except CalledProcessError as e:
-            LOGGER.exception(e)
-            return "ERROR", None, None
+        args = ["terraform", "destroy", "-auto-approve"]
+        output = self._run_command(args=args)
+        return output.returncode, output.stdout, output.stderr
     def output(self):
-        try:
-            output = subprocess.run(
-                ["terraform", "output", "-json"],
-                check=True,
-                cwd=self._repo_path,
-                env=self._env,
-                capture_output=True,
-            )
-            return output.stdout
-        except CalledProcessError as e:
-            LOGGER.exception(e)
-            return None
+        args = ["terraform", "output", "-json"]
+        output = self._run_command(args=args)
+        output.check_returncode()
+        return output.stdout
 class Ansible(Engine):
-    def __init__(self, repo_path, my_env):
+    def __init__(self, repo_path, my_env, skip_inventory=False):
         super().__init__(name="Ansible", repo_path=repo_path, env=my_env)
-        self.__parse_inventory()
+        if not skip_inventory:
+            self.__parse_inventory()
     def __parse_inventory(self):
-        with open(f"{self._repo_path}/inventory.j2", "r") as f:
-            inventory = Template(f.read())
+        environment = Environment(loader=FileSystemLoader(self._repo_path))
         with open(f"{self._repo_path}/inventory", "w") as f:
-            f.write(inventory.render(self._env))
+            template = environment.get_template("inventory.j2")
+            f.write(template.render(self._env))
-        with open(f"{self._repo_path}/ssh_key.j2", "r") as f:
-            ssh_key = Template(f.read())
         with open(f"{self._repo_path}/ssh_key", "w") as f:
-            f.write(ssh_key.render(self._env))
+            template = environment.get_template("ssh_key.j2")
+            f.write(template.render(self._env))
         os.chmod(f"{self._repo_path}/ssh_key", 0o0600)
     def apply(self):
         LOGGER.info("About to apply ansible")
-        try:
-            output = subprocess.run(
-                ["ansible", "all", "-i", "inventory", "-m", "wait_for_connection"],
-                check=True,
-                cwd=self._repo_path,
-                env=self._env,
-                capture_output=True,
-            )
+        for _ in range(2):
+            args = [
+                "ansible",
+                "all",
+                "-i",
+                "inventory",
+                "-m",
+                "wait_for_connection",
+            ]
+            output = self._run_command(args=args)
+            if output.returncode != 0:
+                time.sleep(10)
+                continue
             LOGGER.info("All hosts in the inventory are reachable.")
-            output = subprocess.run(
-                ["ansible-playbook", "-i", "inventory", "main.yml"],
-                check=True,
-                cwd=self._repo_path,
-                env=self._env,
-                capture_output=True,
-            )
-            return "CREATED", output.stdout, output.stderr
-        except CalledProcessError as e:
-            LOGGER.exception(e.output)
-            raise e
+            args = ["ansible-playbook", "-i", "inventory", "main.yml"]
+            output = self._run_command(args=args)
+            if output.returncode != 0:
+                time.sleep(10)
+                continue
+            return output.returncode, output.stdout, output.stderr
+        return output.returncode, output.stdout, output.stderr
     def destroy(self):
         LOGGER.info("Nothing to be seen here.")
-        return "DESTROYED", None, None
+        return 0, None, None
     def output(self):
         LOGGER.info("Nothing to be seen here.")
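The new `apply` retry loop above (two attempts, pausing 10 s after a failure) is an instance of a generic bounded-retry pattern; an isolated sketch, with the sleep shortened and `flaky_step` standing in for the ansible invocations:

```python
import time


def retry(step, attempts=2, delay=0.01):
    """Call step until it returns exit code 0, sleeping between the bounded
    attempts; returns the last exit code (the loop above uses a 10 s delay)."""
    rc = 1
    for _ in range(attempts):
        rc = step()
        if rc == 0:
            return rc
        time.sleep(delay)
    return rc  # last non-zero code after exhausting attempts


calls = {"n": 0}


def flaky_step():
    """Stand-in for the ansible calls: fails once, then succeeds."""
    calls["n"] += 1
    return 0 if calls["n"] >= 2 else 1


rc = retry(flaky_step)
```

Bounding the attempts matters in the IEM context: these calls run as FastAPI background tasks, so an unbounded retry would silently pin a worker forever on an unreachable host.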
......