Pipelines that ship revenue, not just reports.
I'm Abhinav — 4+ years architecting ETL/ELT on Snowflake, Azure, Palantir Foundry, and Airflow. Currently moving enterprise Martech datasets across the stack at AT&T, making every campaign measurably faster and more accurate.
A pipeline I've been running for 4 years.
duration: 4.3yr · retries: 0
I treat pipelines like products — with SLAs, observability, and a changelog. Day-to-day I'm moving enterprise Martech datasets between Foundry, Azure, and Snowflake, and writing the Airflow DAGs that feed every downstream campaign.
What I care about: query plans that don't lie, schemas that age well, and 3am pages that never happen. Previously I hand-translated a 40-year-old UniBasic lease-accounting system into modern SQL — so I have patience for legacy, too.
Stack, mapped.
rows=5 · cols=5
Where the work happens.
Data Engineer & Snowflake Developer
- Architected and optimized ETL pipelines for AT&T's Abandon-Cart campaign — +3.6% YoY conversion lift, 100% record accuracy.
- Led end-to-end migration of Martech datasets from Palantir Foundry → Snowflake (Parquet extracts, Azure Blob staging, SnowSQL ingest) — +40% pipeline perf.
- Built and managed Apache Airflow DAGs to orchestrate ETL jobs; automated S3 uploads and downloads for vendor consumption.
- Shipped data engineering for the iPhone Early Access launch; maintained Tableau dashboards for pipeline health.
- Designed Infoworks pipelines for cross-platform replication; authored ER diagrams and solution docs with 100% client compliance.
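The Foundry → Snowflake hop above comes down to three moves: Parquet extracts, Azure Blob staging, and a SnowSQL COPY INTO. A minimal sketch of the load statement builder — stage, table, and prefix names here are illustrative placeholders, not the production objects:

```python
def copy_parquet_sql(table: str, stage: str, prefix: str) -> str:
    """Build a Snowflake COPY INTO for Parquet files on an Azure Blob
    external stage. All identifiers are hypothetical examples."""
    return (
        f"COPY INTO {table} "
        f"FROM @{stage}/{prefix} "
        "FILE_FORMAT = (TYPE = PARQUET) "
        "MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;"
    )

sql = copy_parquet_sql("MARTECH.CAMPAIGN_EVENTS", "azure_blob_stage", "extracts/")
```

MATCH_BY_COLUMN_NAME maps Parquet columns to table columns by name rather than position, which is one way a load like this survives upstream schema drift.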
SQL Server Developer
- Migrated the InfoLease lease-accounting app from legacy UNIDATA/PICK to a modern IDS Web Application — reverse-engineering UniBasic into optimized SQL.
- Designed & published business reports in Pentaho Report Designer; validated post-migration data integrity with Beyond Compare.
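The integrity checks in that last bullet reduce to one question: do the legacy and migrated extracts hold the same rows? One way to answer it at scale is an order-insensitive table fingerprint — a sketch with made-up sample rows, not the InfoLease schema:

```python
import hashlib

def table_fingerprint(rows) -> int:
    """Order-insensitive fingerprint of a table extract: hash each row,
    then XOR the digests so row order doesn't affect the result."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

# Same rows in a different order fingerprint identically.
legacy   = [(1, "LSE-001", 1200.50), (2, "LSE-002", 980.00)]
migrated = [(2, "LSE-002", 980.00), (1, "LSE-001", 1200.50)]
```

A mismatch tells you *that* a table diverged, not *where* — which is when a row-level diff tool like Beyond Compare earns its keep.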
Foundational tables.
B.E. Computer Science
Four years of DBMS, systems & algorithms — the primary keys on every query plan I read today.