Your mission
You will work on a variety of projects to help Astrafy customers get the most out of their data, covering both the business and technical sides of each customer use case.
- Design and maintain scalable data pipelines leveraging technologies such as Airflow, dbt, BigQuery, and Snowflake, ensuring efficient and reliable data ingestion, transformation, and delivery.
- Develop and optimize data infrastructure in the Google Cloud environment, implementing best practices for performance, cost, and security.
- Use Terraform and Kubernetes to automate infrastructure provisioning and manage containerized workloads, promoting agility and repeatability across environments.
- Implement robust data governance and quality measures, ensuring accuracy, consistency, and compliance throughout the data lifecycle.
- Collaborate with cross-functional teams to design and deploy Looker dashboards and other analytics solutions that empower stakeholders with actionable insights.
- Continuously refine data architecture to accommodate changing business needs, scaling solutions to handle increased data volume and complexity.
- Champion a culture of innovation by researching, evaluating, and recommending emerging data technologies and industry best practices.
Beyond these core responsibilities, we place strong value on knowledge-sharing and community engagement. At Astrafy, we are active contributors on platforms like Medium, where we regularly publish articles sharing our expertise, insights, and innovations to grow both our community and our industry presence.
As part of the role, you will also be encouraged to write articles, offering valuable insights and thought leadership to further enhance Astrafy's impact and foster greater collaboration within our field.