Python Developer — Data Engineering & AI Automation (Colombia)

28 Sep | R. Phants & Associates | Colombia

TL;DR: Build production-grade data pipelines and real‑time voice automation.

You’ll ship Python services on GCP (BigQuery, Dataflow, Pub/Sub, Cloud Functions, Composer, Vertex AI), integrate Twilio/Asterisk/Vicidial, and operationalize LLM + TTS/STT workflows with strong data/infra reliability.

Overview

Our products power AI-driven communications (including ringless voicemail automation) at scale.

Your work will turn messy call events, transcripts, and analytics into reliable, low‑latency systems that drive customer outcomes and revenue.

What you’ll do (Outcomes)

- Ship reliable data pipelines in GCP (BigQuery, Dataflow, Pub/Sub, Cloud Functions, Composer) with SLOs for latency, throughput, and cost per GB processed.

- Build Python backends/APIs (FastAPI/Flask), codify contracts with Pydantic, manage persistence via SQLAlchemy + Alembic, and keep schemas & migrations healthy across environments.

- Own VoIP integrations: call routing, voicemail drops, and campaign automations with Twilio (Voice, Messaging, Studio, Conversations, TaskRouter); customize and scale Asterisk/Vicidial for outbound dialing.

- Operationalize voice AI: integrate TTS/STT (Cartesia, ElevenLabs, Google, AWS Polly), wire real‑time STT → intent → TTS loops, and measure WER/latency.

- Enable LLM workflows: build dataset prep, evaluation, and inference pipelines; support Vertex AI jobs, registries, CI/CD, monitoring, and rollbacks.

- Make systems observable: metrics, traces, and logs (e.g., OpenTelemetry, Cloud Monitoring) with dashboards and actionable alerts.

- Collaborate with data/ML/DevOps/product on architecture, reviews, and production readiness.
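As a rough illustration of the Pydantic-contract and SQLAlchemy persistence work described above, here is a minimal sketch. The `CallEvent` table and `record_event` helper are hypothetical names, not part of any existing codebase; in production Alembic would own the schema migrations, but `create_all()` is enough for a local demo against in-memory SQLite.

```python
from datetime import datetime, timezone

from sqlalchemy import Column, DateTime, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class CallEvent(Base):
    """One row per telephony event (table and column names are illustrative)."""
    __tablename__ = "call_events"

    id = Column(Integer, primary_key=True)
    call_sid = Column(String(64), nullable=False, index=True)
    event_type = Column(String(32), nullable=False)  # e.g. "answered", "voicemail"
    occurred_at = Column(DateTime, nullable=False)


def record_event(session: Session, call_sid: str, event_type: str) -> CallEvent:
    """Persist a single call event and commit."""
    event = CallEvent(
        call_sid=call_sid,
        event_type=event_type,
        occurred_at=datetime.now(timezone.utc),
    )
    session.add(event)
    session.commit()
    return event


# Local demo: in-memory SQLite stands in for PostgreSQL/MySQL.
engine = create_engine("sqlite://")
Base.metadata.create_all(engine)
with Session(engine) as session:
    record_event(session, "CA123", "answered")
    stored = session.query(CallEvent).filter_by(call_sid="CA123").one()
    event_type = stored.event_type
```

The same model class is what an Alembic autogenerate run would diff against to produce versioned migrations across environments.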

Tech you’ll touch

- Python, FastAPI/Flask, Pydantic, SQLAlchemy, Alembic, PostgreSQL/MySQL

- GCP: BigQuery, Dataflow, Pub/Sub, Cloud Functions, Cloud Run, Composer (Airflow), Vertex AI, Cloud Storage

- VoIP: Twilio (Voice/Messaging/Conversations/Studio/TaskRouter), Asterisk/Vicidial, SIP/RTP

- Voice/LLM: Cartesia, ElevenLabs, Google TTS/STT, AWS Polly, Whisper

- Platform/Tooling: Docker, Terraform, GitHub Actions, pytest, Ruff/mypy, OpenTelemetry, Prometheus/Grafana/Cloud Monitoring

Must‑have skills

- Python expertise for APIs, services, and data pipelines.

- GCP experience (BigQuery, Dataflow, Pub/Sub, Storage, Composer, Vertex AI).

- Pydantic for validation; SQLAlchemy for ORM; Alembic for migrations.

- Strong SQL and relational schema design.

- Practical VoIP/telephony knowledge (SIP/RTP, call flows) and Twilio APIs.
Nice‑to‑have skills

- Cartesia/ElevenLabs and other TTS/STT providers.

- Asterisk/Vicidial customization & scaling (AMI/ARI).

- LLM integration (GPT/PaLM/OSS), prompt/latency optimization in real‑time paths.

- MLOps (Vertex pipelines, model registry, evaluation/monitoring).

- Event streaming (Kafka, Pub/Sub).

- Security/Compliance awareness for voice/data (e.g., PII redaction, call recording governance; STIR/SHAKEN/TCPA considerations where relevant).

How we work

- Remote/hybrid flexibility; async‑first with clear ownership and fast feedback loops.

- Production‑minded: testing, canaries, rollbacks, and on‑call rotation shared across the team.

- Ship small, measure impact, iterate.

Success looks like (30/60/90)

- 30 days: Production access + dashboards; ship a small service or pipeline; add unit/integration tests around a Twilio webhook or Pub/Sub consumer.

- 60 days: Replace a fragile ETL with Dataflow; cut pipeline cost/latency by ≥25%; harden Alembic migrations with blue/green strategy.

- 90 days: Launch a real‑time STT → LLM → TTS microservice with SLOs.

Compensation & benefits

- Competitive salary + performance incentives.

- Remote/hybrid, learning budget (GCP certs, ML/telephony), modern tooling, and clear growth paths.

Hiring process (what to expect)

- Intro call (role fit, past systems you’ve owned).

- Technical deep‑dive (data pipeline & telephony scenarios).

- Practical exercise (see below) or code walkthrough.

- Panel with ML/DevOps/product on cross‑functional design.

- Offer.

Practical exercise

- Build a small FastAPI service that:
  - Validates an inbound Twilio Voice webhook with Pydantic,
  - Publishes events to Pub/Sub,
  - Streams to BigQuery via Dataflow (template or Python SDK),
  - Manages tables/migrations with Alembic,
  - Includes pytest tests and basic OpenTelemetry traces.

- Bonus: integrate a TTS/STT provider and measure E2E latency.
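One way to start the exercise is the validation step on its own: a Pydantic model covering a subset of the form fields Twilio posts on a Voice webhook (`CallSid`, `From`, `To`, `CallStatus`), turned into a JSON payload ready for Pub/Sub. The model and field subset are an assumed sketch, not a complete Twilio schema; the actual publish call is shown only as a comment since it needs GCP credentials.

```python
import json

from pydantic import BaseModel, Field


class TwilioVoiceWebhook(BaseModel):
    """Subset of Twilio Voice webhook form fields (illustrative, not exhaustive)."""
    call_sid: str = Field(alias="CallSid")
    account_sid: str = Field(alias="AccountSid")
    from_number: str = Field(alias="From")
    to_number: str = Field(alias="To")
    call_status: str = Field(alias="CallStatus")  # e.g. "ringing", "completed"


# Form data as Twilio would POST it (values are fake placeholders).
form = {
    "CallSid": "CA0123456789abcdef0123456789abcdef",
    "AccountSid": "AC0123456789abcdef0123456789abcdef",
    "From": "+15550001111",
    "To": "+15552223333",
    "CallStatus": "ringing",
}

# Validation raises pydantic.ValidationError on missing/invalid fields.
event = TwilioVoiceWebhook(**form)

# Serialize the validated event for a Pub/Sub message body.
payload = json.dumps(
    {"call_sid": event.call_sid, "status": event.call_status}
).encode("utf-8")

# In the full exercise you would publish from the FastAPI handler, e.g.:
# publisher = pubsub_v1.PublisherClient()
# publisher.publish(topic_path, payload)
```

In the full service this model would be the request contract of the FastAPI route, with the Pub/Sub publish behind it and the Dataflow → BigQuery stream downstream.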

Equal opportunity

We welcome applicants of all backgrounds and identities.

If you’re excited by the role but don’t meet 100% of the bullets, please apply—skills grow quickly in our environment.

Seniority level: Entry level

Employment type: Full-time

Job function: Engineering and Information Technology

Industries: Technology, Information and Internet



The original posting can be found on Kit Empleo: kitempleo.com.co/empleo/61169935
