Company Information:
LINQX is an industry-leading provider of end-to-end digital solutions and analytics for the oil and gas sector, leveraging advanced data analytics to optimize engineering and operations. Its cloud-based platforms deliver AI-driven insights that enhance reservoir performance and streamline workflows. Driven by a mission of data-driven innovation, LINQX combines cutting-edge technology with oilfield expertise to deliver transformative results.
Website: https://linqx.io/
NOTE: THIS POSITION DOES NOT OFFER VISA SPONSORSHIP.
Position Summary:
The Senior Data Engineer designs, builds, and operates cloud-scale data platforms and pipelines that power AI, analytics, ML, and data products across the business. This role partners with Product, Architecture, and Data Science to deliver secure, reliable, and cost-efficient batch and streaming data services on AWS/Azure. The engineer implements solutions aligned with the technical strategy for data modeling, orchestration, storage, and compute layers. The Senior Data Engineer will also work directly with developers and architects from other product lines to enable integration and data strategies across products, mentoring and collaborating with teams on best practices in data engineering, reliability, and governance.
Key Roles / Responsibilities
- Lead the end-to-end design and implementation of enterprise data platforms (e.g., lakehouse, streaming, and “data hub” patterns); coordinate dependencies and integrations with product and enterprise systems.
- Provide technical leadership to internal and partner teams; plan scope, estimate effort, manage risks, and uphold delivery commitments.
- Coach engineers on data engineering standards, observability, and operational excellence; conduct design and code reviews.
- Define and evolve cloud data architecture (ingest, storage, compute, catalog, access) for both real-time and batch use cases; select services and patterns aligned to scalability, performance, and security requirements.
- Build robust data pipelines (ELT/ETL) using orchestration frameworks (e.g., Airflow, Dagster), event/stream platforms (e.g., Kafka/Kinesis/Event Hub/IoT Hub), and distributed processing (e.g., Spark/Databricks/Beam/Flink).
- Implement lakehouse/lake patterns with medallion layering, CDC, and schema evolution; operationalize quality gates, testing, and lineage.
- Establish SLOs for data freshness, availability, and quality; implement observability (metrics, logs, lineage) and on-call runbooks.
- Optimize cost/performance of storage and compute; manage capacity, autoscaling, and usage governance for data solutions.
- Ensure privacy, security, and compliance (IAM, tokenization, row/column security, data retention); collaborate with enterprise security.
- Translate product and stakeholder requirements into data contracts and SLAs; document architecture and operational procedures.
- Evaluate emerging data technologies; create POVs and reference implementations.
- Anticipate future needs (multi-cloud, edge/IoT ingestion, vector/AI workloads) and create roadmaps and platform capabilities.
- Champion modernizing existing data architectures using current best practices: lakehouse, CDC, medallion layering, data mesh principles, and streaming.
Skills / Qualifications
- Strong software engineering skills in Python/SQL and one of Scala/Java; familiarity with REST/gRPC and microservices patterns.
- Expertise with cloud data services (e.g., Azure Data Lake/Databricks/Synapse; or AWS S3/EMR/Glue/Redshift; or GCP BigQuery/Dataflow).
- Hands-on with distributed stores (e.g., Delta/Iceberg/Hudi; Kafka; Redis) and both SQL/NoSQL databases (e.g., Postgres, Cosmos DB/DynamoDB, MongoDB, Cassandra).
- CI/CD (GitHub Actions/Azure DevOps), containerization (Docker/Kubernetes), Infrastructure as Code (Terraform).
- Data governance and security (catalogs, lineage, RBAC/ABAC, encryption, key management).
- Exposure to or experience implementing real-time systems; IoT/hardware design experience preferred; experience within the oil and gas industry a strong plus.
- Excellent stakeholder communication; ability to translate business needs into scalable data solutions.
- Bachelor’s degree in Computer Science, Engineering, or a related field (Master’s preferred).
- 5–10+ years in data engineering or platform roles delivering enterprise-grade data systems (SaaS or large-scale internal platforms).
- 2+ years leading or building technical design/delivery for cloud data platforms at scale.
- Proven delivery of streaming and batch pipelines, lakehouse implementations, and production ML data flows.
- Experience operating data platforms with SLAs/SLOs, cost governance, and on-call ownership.
Physical Requirements
- Prolonged periods sitting at a desk and working on a computer.
- Must be able to lift 15 pounds at times.
- Positions self to install equipment, including under desks.
- Moves throughout the building to access files.
- Must be able to comprehend and follow written and oral instructions.
- Must be able to complete tasks even with frequent interruptions.
- Must be able to use discretion and independent judgment as needed.
- Must be able to speak clearly on the phone and to fellow workers.
This job description should not be interpreted as an exhaustive list of responsibilities or as an employment agreement between the employer and the employee. The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification and are subject to change as the needs of the employer and requirements of the job change. Any essential functions of this position will be evaluated as necessary should an employee/applicant be unable to perform the functions or requirements due to a disability as defined by the Americans with Disabilities Act (ADA). Reasonable accommodation for the specific disability will be made for the employee/applicant when possible.
I acknowledge that I have read and understand the description of this position and have had the opportunity to ask my supervisor about any points I did not understand. I hereby state that I can perform the essential functions of this position with or without reasonable accommodation.