Please attach a current resume with up-to-date contact information.
Dynamic Systems, Inc. (DSI) is a leading mechanical and process construction contractor committed to empowering our teams with high-quality data, powerful analytics, and integrated decision-making tools. We leverage Microsoft Fabric and Power BI to drive innovation across accounting, construction project management, and ERP systems.
As our need for external data integration grows, we're seeking a dedicated Data Engineer who specializes in external data ingestion, distributed processing, and scalable data pipeline architecture, and who will ensure our Microsoft Fabric environment remains high-performing, secure, and business-driven.
This is an in-office job in Buda, Texas.
External Data Ingestion & Pipeline Development (Primary Focus)
• Design, build, and maintain data pipelines that ingest data from APIs, SFTP servers, databases, and cloud sources into Microsoft Fabric's Lakehouse and Data Warehouse environments (see the ingestion sketch following this list).
• Prioritize schema management, security compliance, and robust ingestion frameworks.
• Architect ingestion and transformation pipelines using Apache Spark and PySpark.
• Build scalable systems optimized for distributed parallelism, partitioning, schema evolution, and fault tolerance.
• Design and manage Bronze (raw), Silver (cleansed), and Gold (business-ready) data layers within Fabric (see the layering sketch following this list).
• Ensure strong governance practices across the lakehouse architecture.
• Integrate pipelines with version-controlled CI/CD frameworks (Azure DevOps, Git).
• Build resilient pipelines with automated testing, anomaly detection, and monitoring.
• Work closely with BI Developers to align ingestion needs with semantic modeling, reporting, and downstream security requirements (row-level and column-level security, RLS/CLS).
• Partner with DevOps teams to maintain high standards of data integrity and system reliability.
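For candidates who want a concrete picture of the ingestion work described above, here is a minimal, illustrative PySpark sketch of landing an external API feed into a Bronze Lakehouse table. The endpoint URL, pagination scheme, and table name are hypothetical placeholders for illustration only, not DSI systems.

    # Illustrative only: ingest a paginated JSON API into a Bronze Delta table.
    # The endpoint, pagination contract, and table name are hypothetical.
    import requests
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()  # preconfigured in a Fabric notebook

    def fetch_records(url):
        """Pull all pages from a hypothetical REST endpoint."""
        records, page = [], 1
        while True:
            resp = requests.get(url, params={"page": page}, timeout=30)
            resp.raise_for_status()
            batch = resp.json().get("results", [])
            if not batch:
                break
            records.extend(batch)
            page += 1
        return records

    raw = fetch_records("https://api.example.com/v1/projects")  # placeholder URL
    df = spark.createDataFrame(raw).withColumn("_ingested_at", F.current_timestamp())

    # Land the raw payload unmodified in the Bronze layer (append-only, schema tracked).
    (df.write
       .format("delta")
       .mode("append")
       .option("mergeSchema", "true")
       .saveAsTable("bronze_projects"))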
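Likewise, a brief sketch of how Bronze data might be promoted through cleansed Silver and business-ready Gold layers; the table and column names here are hypothetical examples rather than DSI's actual data model.

    # Illustrative only: promote Bronze records to Silver and Gold Delta tables.
    # Table and column names are hypothetical placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()

    # Silver: deduplicate, type, and standardize the raw Bronze records.
    bronze = spark.read.table("bronze_projects")
    silver = (bronze
              .dropDuplicates(["project_id"])
              .withColumn("contract_value", F.col("contract_value").cast("decimal(18,2)"))
              .filter(F.col("project_id").isNotNull()))
    silver.write.format("delta").mode("overwrite").saveAsTable("silver_projects")

    # Gold: aggregate into a reporting-ready shape for downstream Power BI models.
    gold = (silver
            .groupBy("region", "project_status")
            .agg(F.sum("contract_value").alias("total_contract_value"),
                 F.count("project_id").alias("project_count")))
    gold.write.format("delta").mode("overwrite").saveAsTable("gold_project_summary")

In practice, steps like these would run as scheduled notebooks or pipeline activities with the automated testing, anomaly detection, and monitoring called out above.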
Dynamic Systems, Inc. is an equal opportunity employer and does not discriminate on the basis of race, creed, color, gender, age, religion, disability, national origin, or veteran status.