Open to Data & AI Engineering Opportunities

Data & Agentic Engineer focused on high-performance ETL, data platforms, and autonomous AI workflows.

I design and optimize data systems that scale. Today I build production ETL pipelines at NTT DATA and am expanding into agentic engineering through reliable tool-calling workflows.

3+ years of professional experience in Data Engineering (since June 2022)

Current Role
Data Engineer at NTT DATA Europe & LATAM
Location
Based in Spain

Current Focus

Engineering Domains I Build In

My work combines data pipeline reliability with agentic workflow experimentation and production delivery practices.

Data Engineering

Resilient ETL design, orchestration, and performance optimization across legacy and modern stacks.

  • Batch and near-real-time pipelines
  • Airflow orchestration and migration
  • PySpark and distributed data processing

Agentic Engineering

Production-ready software delivery using agentic engineering tools (Claude Code, OpenCode, Codex) with structured workflows, reusable skills, and quality hooks.

  • Tool-assisted implementation loops with explicit planning, execution, and review gates
  • Reusable skills and command hooks to keep outputs consistent and maintainable
  • Verification-first delivery with tests, build checks, and safe iteration patterns

Platform & Delivery

Cloud-native delivery practices that keep teams shipping faster without sacrificing observability.

  • Monitoring and operational guardrails
  • CI/CD discipline and release confidence
  • Documentation and maintainable standards

Recent Development

Featured Agentic Project

The latest product I launched, built around clarity, trust, and practical user value.

Mi Calculadora Financiera

Production financial assistant experience built with agentic engineering workflows

A live financial calculator platform delivered in one month using an agentic engineering workflow. Development combined human direction with Claude Code, OpenCode, and Codex to accelerate delivery while upholding production standards.

Challenge

Build a trustworthy financial UX that explains calculations clearly, supports fast iteration, and remains maintainable as new calculators and assistant flows are added.

Solution

Defined scoped implementation loops, used tool-assisted coding sessions for feature delivery, and enforced quality hooks for validation before release. The workflow prioritized explicit tasks, review checkpoints, and reproducible changes.

  • Shipped a public, production-ready product from concept to launch in one month.
  • Improved delivery speed through agentic coding loops without sacrificing quality control.
  • Established a reusable workflow pattern (skills, hooks, and validation steps) for future products.

JavaScript · HTML/CSS · Claude Code · OpenCode · Codex · Agentic workflow skills and hooks · SEO and content strategy

Experience

Roles where I improved data reliability, orchestration quality, and delivery speed.

  1. Data Engineer

    NTT DATA Europe & LATAM

    Designing and maintaining data pipelines across public-sector projects with strong reliability and delivery expectations.

    • Developed ETL pipelines using PySpark and HBase for environmental programs in Andalusia.
    • Migrated legacy Pentaho Data Integration pipelines to Apache Airflow for Tenerife initiatives.
    • Contributed to maintainability and operational quality across multi-project data workflows.
  2. Business Intelligence Developer

    cdmon

    Delivered internal analytics and automation capabilities for business and engineering teams.

    • Built and maintained ETL pipelines with Apache Airflow.
    • Created insight dashboards and reporting flows using Apache Superset.
    • Developed internal APIs and improved legacy code reliability.

Selected Projects

Additional technical work across web engineering and applied optimization.


Personal Web Portfolio

Portfolio website built with Astro and Tailwind CSS as a lightweight, performant front-end foundation.

Astro · Tailwind CSS

PC Component Optimization Under Budget Constraints

Data science project that combines scraped pricing data, preprocessing, and a genetic algorithm to optimize computer component selection with constraints.

Python · Web Scraping · Genetic Algorithms · Streamlit
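
As a rough illustration of the approach, a minimal genetic algorithm for budget-constrained component selection might look like the sketch below. The catalog, prices, and performance scores here are made-up placeholders (the actual project works from scraped pricing data), and the operators are deliberately simple: truncation selection, one-point crossover, and point mutation.

```python
import random

random.seed(42)

# Hypothetical catalog: (name, price, performance score) per component slot.
CATALOG = {
    "cpu": [("cpu_a", 150, 60), ("cpu_b", 250, 85), ("cpu_c", 400, 100)],
    "gpu": [("gpu_a", 200, 55), ("gpu_b", 350, 80), ("gpu_c", 600, 100)],
    "ram": [("ram_a", 50, 40), ("ram_b", 90, 70), ("ram_c", 140, 95)],
}
SLOTS = list(CATALOG)
BUDGET = 700

def fitness(build):
    """Total performance score, zeroed when the build exceeds the budget."""
    price = sum(CATALOG[s][i][1] for s, i in zip(SLOTS, build))
    perf = sum(CATALOG[s][i][2] for s, i in zip(SLOTS, build))
    return perf if price <= BUDGET else 0

def random_build():
    # A build is one catalog index per slot, e.g. [1, 0, 2].
    return [random.randrange(len(CATALOG[s])) for s in SLOTS]

def evolve(generations=40, pop_size=30, mutation_rate=0.2):
    pop = [random_build() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(SLOTS))  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:    # point mutation on one slot
                slot = random.randrange(len(SLOTS))
                child[slot] = random.randrange(len(CATALOG[SLOTS[slot]]))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([CATALOG[s][i][0] for s, i in zip(SLOTS, best)], fitness(best))
```

The budget constraint is handled with a simple death penalty (over-budget builds score zero); the real project can use richer constraint handling and a larger search space, where a genetic algorithm pays off over brute force.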

About

I am an adaptable and pragmatic engineer who enjoys combining analytical rigor with creative problem-solving.

My current focus is building robust data systems and extending that mindset into agentic engineering patterns that are production-ready.

Outside work, I keep learning through programming projects, chess, and challenges that push me beyond my comfort zone.

Portrait of Jesús Muñoz González

Let's collaborate

Building reliable data and AI workflows with measurable impact.

If you are hiring for Data Engineering or Agentic Engineering roles, I can help accelerate delivery while keeping architecture and operations robust.