Who You'll Be Joining
The Business Intelligence team designs, maintains and improves the analytics infrastructure that informs decisions across the company. Analytics engineering is a core and growing investment, and this role sits at the center of that work.
The Staff Analytics Engineer is a deeply technical individual contributor who owns the transformation layer of the data platform, treats analytical data as a product, and drives how data is modeled, certified, and delivered. You define the standards, reusable patterns, and decision frameworks for the lakehouse Silver and Gold layers, and raise analytics engineering practice through mentorship and technical leadership.
This role follows a hybrid schedule, with in-office work required on Wednesdays and Thursdays from our Toronto office to support collaboration, and flexibility to work remotely for the remainder of the week.
How You'll Make an Impact
Transformation Layer Ownership
- Own and evolve the Silver and Gold layers of the lakehouse using dbt
- Apply dbt best practices across project structure, modular design, dimensional modeling, testing, and documentation standards
- Publish curated, certified datasets (conformed dimensions, core metrics) that are reusable and trusted across teams
- Design and maintain the semantic/metrics layer for standardized and consistent self-service analytics
- Design transformation workflows and contribute to end-to-end data pipelines
- Partner with data engineering to align ingestion, schema design, and architecture with analytics needs
- Monitor and optimize pipelines for performance, reliability, and cost efficiency
Data Quality, Governance, and Reliability
- Implement robust data quality frameworks, testing, and observability (e.g., freshness, schema drift, accuracy, anomalies)
- Define and maintain data contracts between producers and consumers (e.g., schema, SLOs, quality expectations)
- Drive schema change management (versioning, backward compatibility, stakeholder communication)
- Design and enforce classification and certification standards so consumers can discover and trust analytical assets
- Ensure production reliability through SLO/SLI measurement, actionable alerting, and continuous improvement
- Own production operations for critical data flows (incident response, runbooks, root cause analysis) against defined SLOs
- Embed privacy and security controls in data models and pipelines (PII handling, masking, access controls, and compliance)
- Drive alignment across Analytics, Product, Engineering, and business stakeholders on analytics data models and platform decisions
- Surface technical constraints and trade-offs to inform delivery planning and long-term platform scalability
- Translate business requirements into technical specs and deliver consumption-ready curated datasets and semantic models for downstream reporting
- Maintain durable documentation for pipelines, models, architecture, and operating processes
- Define and maintain analytics engineering standards for dbt development (modelling conventions, testing requirements, documentation, review expectations)
- Hold code quality to a high bar through structured reviews focused on readability, modularity, testability, and standards compliance
- Provide technical mentorship that elevates craft and quality across analytics engineering and analytical work
- Evaluate and advocate for tools, patterns, and practices that improve platform reliability, scalability, and maintainability
What You'll Bring
- Deep hands-on dbt expertise: modular, tested, well-documented models; project structure at scale
- Strong data modelling skills (dimensional modelling, star/snowflake) and warehouse optimization
- Proficiency in Python and SQL for building and troubleshooting transformation code and pipelines
- Experience owning a semantic or metrics layer as a shared product that standardizes business logic across tools and teams
- Hands-on experience with GCP services, including BigQuery, Dataflow, and Cloud Storage, and with lakehouse layer interaction
- Production experience with orchestration tools such as Apache Airflow or Cloud Composer
- Experience designing solutions for both batch and real-time processing
- Strong command of data quality and governance, including data contracts, observability, and SLO-driven operations
- Practical application of data-as-a-product principles, including discoverability, lineage, quality guarantees, and versioning
- Proficiency with CI/CD practices, Git and GitHub, and modular development workflows
- Distributed processing familiarity (Spark/Kafka) and credible partnership with data engineering on end-to-end design
- Working knowledge of data privacy requirements and technical controls (PII handling, masking, encryption, retention, and access controls)
- Familiarity with cloud cost optimization across compute, storage, and query patterns
- Proven ability to influence across engineering, analytics, and business through technical credibility and clear communication; ability to explain architecture trade-offs to non-technical stakeholders
Just so you know: the hired candidate will be required to complete a background check.
What Happens After You Apply
Application review will happen, and by an actual person in Talent Acquisition. We receive upwards of 100 applications for some roles, so review can take a few days, but every applicant can expect a note regarding their application status.
Interview Process
- Round 1: 30-minute phone call with a member of Talent Acquisition
- Round 2: 75-minute video interview with the Hiring Manager covering technical, behavioral, and situational questions to evaluate your qualifications and fit for the role
- Round 3: 90-minute video interview with team members to delve into your technical expertise and experience
- Final Interview: 45-minute video interview with the VP
Generac is committed to fair and equitable compensation practices. The salary range for this role, based in Toronto, Ontario, Canada, is between $147,700 and $191,900 CAD. This compensation will ultimately be in line with the location in which the position is filled. Final compensation for this role will be determined by various factors such as a candidate's relevant work experience, skills, certifications, and geographic location. This role is eligible for variable compensation, including short-term and long-term incentives.
This position includes a comprehensive benefits package: medical, dental, and vision plans; life and long-term disability insurance; flexible spending and health savings accounts; accrued paid time off; paid holidays (10 for Ontario, 11 for British Columbia); and RRSP retirement benefits.
The Company is committed to improving accessibility for Canadians with disabilities and to ensuring that all our employees and applicants have the support and the tools they need to succeed. We have developed policies relating to human rights, accessibility, and accommodation, and provide all our employees with training on accessibility, including under provincial legislation such as the Accessibility for Ontarians with Disabilities Act, 2005, during orientation and/or on an ongoing basis. If you feel you need accommodation in relation to a disability in the application process or in the future, or have a question or concern about our policies, please reach out to askHR@generac.com.