Keyera Career Opportunity
Why Keyera?
It’s our purpose-driven culture, benefits, and people. From flexible, customizable benefits and an employer-funded pension plan to our Keyera Connects social investment program and paid employee volunteer days, we’re focused on empowering our people and communities.
When You Work With Us, You'll Enjoy
Flexible benefits to meet your individual and family needs, including:
- $3,500 plus 4.5% of base salary each year to customize your benefits and investments
- Saving plan options, including an Employee Share Purchase Plan, RRSP, and TFSA
- Defined Contribution Pension Plan funded by Keyera up to 10% of base salary
- Wellness Personal Spending Account of $750 per year to cover wellness expenses
- Paid vacation and eight flex days each year
- Two paid volunteer days each year to support the causes that are most important to you
- Employee Family Assistance Program offering a variety of support resources, from professional counselling to financial planning and more
- Keyera’s Northern Allowance is a tiered incentive program based on years of service, currently offered to employees working at the following locations: Grande Prairie Office, Wapiti, Simonette, Gold Creek, South Cheecham, and Fox Creek Terminal. This information reflects the current state of the program and may be subject to change over time.
Please note that compensation and benefits may be different based on the work location, position, and a candidate's experience and qualifications.
Job Type
Permanent
THE POSITION
The Data Platform Specialist is responsible for designing, developing, and maintaining the infrastructure and systems required for data storage, processing, integration, and analysis. They play a crucial role in building and managing scalable data pipelines and positioning the enterprise data platform as a secure, governed integration hub that enables reliable data exchange across internal systems, external applications, and analytical environments.
The role also carries accountability for performance optimization and cost management of the data platform, ensuring efficient use of consumption-based services and alignment to business value.
Responsibilities
Data Management & Pipeline Development
- Designs, develops, and maintains data pipelines that extract data from various sources, transform it into the desired format, and load it into appropriate data storage systems, ensuring efficient and reliable ETL processes
- Transforms raw data into usable formats through cleansing, aggregation, filtering, enrichment, and modeling techniques
- Creates robust data models and architectures to support analytics initiatives
- Optimizes data pipelines and processing workflows for performance, scalability, reliability, and efficiency
- Monitors and tunes data systems, identifies and resolves performance bottlenecks, and implements optimization strategies including partitioning, caching, and indexing
- Implements and maintains analytics systems, data warehouses, or data lakes to store and manage structured and unstructured data
- Collaborates with data scientists and AI developers to integrate predictive models or machine learning algorithms into analytics workflows
- Ensures data quality and accuracy by implementing data validation, monitoring, and error-handling processes
- Designs and implements the data platform as an enterprise integration hub, supporting both inbound and outbound data flows
- Develops and manages integrations with internal systems, SaaS platforms, operational applications, APIs, event streams, and external partners
- Enables bidirectional data exchange, including batch, streaming, API-based, and event-driven integration patterns
- Ensures data consistency, integrity, security, and compliance across integrated systems
- Works with application, architecture, and integration teams to define integration standards, reusable patterns, and scalable data exchange frameworks
- Implements data quality checks and validation controls within data pipelines to ensure accuracy, consistency, and completeness
- Collaborates with data scientists and analysts to optimize data models for quality, security, performance, and governance
- Contributes to the establishment and enforcement of governance practices for data assets and analytical models
- Supports enterprise data modeling practices, including logical, physical, and semantic modeling where applicable
- Models, monitors, and manages platform costs across cloud and data services based on consumption and pricing models (e.g., compute, storage, data movement, API usage)
- Designs cost-efficient architectures aligned to workload characteristics and service pricing structures
- Implements workload management, scaling policies, and resource optimization strategies to balance performance and cost
- Provides transparency and reporting on platform usage and cost drivers to support financial planning and accountability
- Partners with architecture and finance stakeholders to forecast platform growth and optimize total cost of ownership
- Collaborates across AI, analytics, architecture, and business teams to enable trusted and accessible data products
- Supports executives and senior leaders in understanding the value, risk, and economics of enterprise data assets
- Translates between executive, business, IT, and quantitative stakeholders to align technical solutions with business outcomes
Education
- Bachelor’s degree in Computer Science, Data Science, Software Engineering, Information Systems, or a related field, or equivalent background and experience
- Understanding of and experience working within a Databricks environment
- At least 15 years of work experience in data management or related disciplines, including data integration, modeling, optimization, and data quality
- Proven experience designing and maintaining modern data platforms in cloud-based big data environments (e.g., Databricks, Snowflake)
- Experience building API-based and event-driven integrations across enterprise systems
- Demonstrated experience building, scaling and sustaining enterprise environments
- Experience operating within consumption-based cloud pricing models and managing platform cost optimization initiatives
- Experience working in or supporting asset-intensive industries such as:
- Energy, utilities, mining, manufacturing, transportation, or infrastructure
- Capital projects, operations, reliability, maintenance, or supply chain environments
- Expertise, or directly relevant experience, in designing and building scalable ELT pipelines within the Databricks Lakehouse, leveraging distributed processing and native platform capabilities while integrating with complementary enterprise data and operational solutions
- Strong proficiency in programming languages such as Python and Java
- Advanced SQL skills and experience with modern data warehouse platforms (e.g., Snowflake, Databricks)
- Experience with cloud platforms (AWS, Azure) and modern data architecture patterns
- Experience with relational (SQL) and NoSQL database systems
- Ability to design, build, and deploy data solutions that support AI, ML, BI, and operational use cases
- Strong understanding of integration architectures, API design, event streaming, and data exchange protocols
- Demonstrated ability to optimize platform performance and cost in consumption-based environments
- Strong problem-solving and debugging skills across distributed systems
- Excellent business acumen and interpersonal skills; able to influence and effect change across business lines
- Ability to clearly articulate business use cases, data management concepts, architectural tradeoffs, and financial implications of technical decisions
Posting Expiry Date
Mar 27, 2026
At Keyera, we embrace collaboration, inclusion, and a workplace that is as diverse as the communities we serve. Our values foster an environment for every person to bring their whole self to work.
We offer a well-rounded total compensation package and a comprehensive benefits program designed with the well-being and empowerment of our employees and their families in mind.
If you are interested in an opportunity to join a winning, purpose-driven culture, then you’ll enjoy a career with us.
We thank all applicants for their interest; however, only those considered for an interview will be contacted.