THIS POSITION REQUIRES RELOCATING TO AND LIVING IN CALGARY, AB, CANADA.
This job is hybrid remote within Alberta. If you are accepted, you will need to be ready to relocate from Mexico to Canada within 2-4 weeks of completing the paperwork.
WHO WE ARE
We are culture builders. We believe in and invest in people, using technology, data and organizational psychology to help companies build great work cultures.
We started in Mexico in 2006, built an incredible team over 15 years, and were acquired in 2021.
Now we’re starting again in the beautiful city of Calgary to help Alberta tech companies build cultures people will love working in.
WHO YOU ARE
- You are humble. You share the credit and take pride in your team’s success. You accept responsibility for your mistakes and don’t blame others. You don’t think less of yourself; you think of yourself less.
- You are passionate. Passionate about technology and building great things, not only about sports or music (although there is nothing wrong with that; we love Netflix and video games too).
- You are empathetic. You are aware of what you say and do, and of how it impacts your teammates and clients. Sorry, but nobody wants to work with jerks (at least we don’t, and you certainly don’t want an emotionless manager as your boss), no matter how brilliant you are.
PURPOSE-DRIVEN TECH
If you want to work for companies that are making an impact, if you want to be part of something greater than yourself and not just a job…we are here to help. Come join the purpose-driven tech revolution.
YOUR SKILLS
Well, it is not all about ideals and building a better world; you need some tech skills too:
- Proven experience implementing data engineering solutions with Databricks. Databricks Data Engineering badges are a plus.
- 5+ years of experience using Python/PySpark and/or similar modern languages to automate data pipelines is required.
- 4+ years of experience with Azure Data Factory (preferably) or Airflow or Databricks Jobs (Workflows) for data pipeline orchestration.
- 4 years of experience with Hadoop data lakes.
- 2 years of experience with Databricks Delta Lake and Lakehouse.
- Experience implementing other cloud-native database platforms such as Snowflake is an asset.
- Proficiency with PySpark and SQL is required.
- Proficiency with Scala is an asset.
- Experience with DataOps, CI/CD, DevOps or other Agile delivery approaches is an asset.
- Experience with data science or data modeling is an asset.
Don’t worry if you don’t meet every requirement in the job description. We know that there is no such thing as a “perfect” candidate. Due to company requirements, we do ask for at least 6 years of experience, but that’s it. If this sounds like you, apply to this role.