PropHero is seeking a detail-oriented Data & Analytics Engineer to own the complete data lifecycle, architect event-driven pipelines, and design snowflake-modeled data marts.
Job Description
Company Description
PropHero, the AI- and data-powered platform simplifying property investment, is on a fast track to success! Backed by global VCs and founded by McKinsey alumni, we're constantly expanding our team. Are you a detail-oriented professional with strong organizational skills? Join us in a transparent, respectful, and flexible work environment, surrounded by ambitious, positive people driving change. At PropHero, we're building the future of property investment. If you're ready to contribute to our vision of making property investment as simple as investing in shares or ETFs, we want you on our team!
How Will You Contribute To PropHero?
As a Data & Analytics Engineer at PropHero, you will own the complete data lifecycle - from real-time ingestion to analytics-ready insights. You'll architect event-driven pipelines to stream data from external sources (HubSpot, APIs, webhooks) into PostgreSQL, then transform raw data into dimensional models and actionable dashboards. Working in an AWS ecosystem, you'll build the data infrastructure (Lambda, EventBridge, RDS) while also designing snowflake-modeled data marts and Metabase visualizations. This is a true end-to-end role with equal emphasis on both data engineering (50%) and analytics engineering (50%), where you'll bridge technical infrastructure and business intelligence, ensuring our teams have both reliable data pipelines and clean, business-ready datasets for property valuation models and market analysis.
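To give a flavour of the ingestion side of the role, here is a minimal sketch of a Lambda-style webhook handler that validates an incoming event before it reaches the operational store. All names (fields, sources) are hypothetical illustrations, not PropHero's actual schema:

```python
import json

# Hypothetical required fields for an inbound webhook event
# (e.g. a CRM contact-change notification).
REQUIRED_FIELDS = {"event_id", "object_type", "occurred_at"}

def handle_event(event: dict) -> dict:
    """Validate and normalize one webhook event; reject malformed payloads."""
    body = json.loads(event.get("body", "{}"))
    missing = REQUIRED_FIELDS - body.keys()
    if missing:
        # In a real pipeline, malformed events would be routed to a
        # dead-letter queue (SQS) rather than silently dropped.
        return {"statusCode": 400, "error": f"missing fields: {sorted(missing)}"}
    record = {k: body[k] for k in REQUIRED_FIELDS}
    record["source"] = body.get("source", "webhook")
    return {"statusCode": 200, "record": record}
```

A production version would add signature verification, idempotency on `event_id`, and a write to PostgreSQL, but the validate-then-persist shape is the core of the pattern.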
What will make you succeed with us?
Experience:
- Event-Based Data Streaming: Design and implement event-driven pipelines using AWS services (Lambda, EventBridge, Kinesis/MSK, SQS) to ingest data from external sources in real-time.
- External API Integration: Develop robust connectors for third-party APIs, webhooks, and data sources, ensuring reliable data capture with proper error handling and retry logic.
- AWS Infrastructure Management: Deploy and manage AWS resources (Lambda, RDS, EventBridge, CloudWatch, S3) for scalable data solutions.
- Data Modeling (Snowflake Schema): Design and implement dimensional data models in PostgreSQL using the snowflake schema methodology, creating efficient fact tables, dimension tables, and slowly changing dimensions (SCDs).
- Data Transformation Pipelines: Build SQL-based transformation workflows to convert operational database tables into analytics-ready data marts, ensuring data consistency and business logic integrity.
- Data Marts Development: Create purpose-built data marts for different business domains (property valuation, customer analytics, market trends) optimized for analytical queries.
- Analytics & Reporting: Perform data analysis to answer business questions, identify trends, and deliver actionable insights to product and leadership teams.
- Metrics Definition: Partner with business stakeholders to define KPIs, metrics, and business logic; document metric definitions and calculation methods.
- Data Quality & Validation: Implement schema validation, data type checking, and automated quality gates at both the ingestion layer and transformation layer to ensure data accuracy and consistency.
- SQL & Database Optimization: Write efficient, performant SQL queries; optimize query performance and database design through proper indexing, query structure, materialized views, and connection pooling.
- Documentation & Collaboration: Maintain clear documentation of pipeline architecture, data flows, API integrations, data models, transformation logic, and metric definitions; work closely with distributed teams across different time zones.
- End-to-End Ownership: Take full ownership of data systems from ingestion to insights, ensuring seamless integration between infrastructure and analytics layers.
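To illustrate the slowly-changing-dimension work mentioned above, here is an in-memory sketch of a Type 2 SCD update, with the close-old-row/open-new-row logic that would normally live in SQL. Field names (`property_id`, `suburb`) are hypothetical examples, not PropHero's schema:

```python
from datetime import date

def apply_scd2(dimension: list[dict], incoming: dict, today: date) -> list[dict]:
    """Apply a Type 2 slowly changing dimension update.

    Each row carries valid_from/valid_to/is_current; a changed attribute
    closes the current row and appends a new current one, preserving history.
    """
    key = incoming["property_id"]
    current = next(
        (r for r in dimension if r["property_id"] == key and r["is_current"]),
        None,
    )
    if current and current["suburb"] == incoming["suburb"]:
        return dimension  # no attribute change: nothing to do
    if current:
        current["valid_to"] = today   # close the old version
        current["is_current"] = False
    dimension.append({
        "property_id": key,
        "suburb": incoming["suburb"],
        "valid_from": today,
        "valid_to": None,
        "is_current": True,
    })
    return dimension
```

In the PostgreSQL data marts this would be expressed as a MERGE/UPDATE-plus-INSERT transformation step rather than Python, but the versioning logic is the same.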
Requirements:
- 4+ years of experience in data engineering and analytics roles with proven end-to-end data pipeline ownership.
- Strong proficiency in Python for building ETL/ELT pipelines, API integrations, and data validation logic.
- SQL proficiency: Advanced SQL skills including window functions, CTEs, complex joins, and query optimization for PostgreSQL.
- Hands-on AWS experience with Lambda, EventBridge, Kinesis/SQS, RDS/PostgreSQL, CloudWatch, and S3.
- Event-driven architecture: Proven experience with event buses, message queues, webhooks, and streaming architectures.
- Snowflake-schema data modeling: Deep understanding of dimensional modeling principles, star/snowflake schemas, fact/dimension tables, and normalization techniques.
- API integration expertise: Strong experience with REST APIs, authentication methods (OAuth, API keys), rate limiting, and error handling.
- BI tools experience: Hands-on experience with Metabase, Tableau, Looker, or similar BI platforms for dashboard development.
- Professional English proficiency: Excellent written and verbal English communication skills for technical documentation, code comments, analysis reports, presenting insights, and daily collaboration with Australia and Spain-based team members.
- Problem-solving: Analytical mindset with ability to debug complex pipeline issues, optimize query performance, and implement robust error recovery.
- Remote collaboration: Comfortable working remotely with distributed teams across different time zones, able to communicate complex technical concepts clearly in English.
- dbt experience (preferred): Familiarity with dbt or similar transformation frameworks for building modular, tested SQL pipelines.
- HubSpot or CRM API experience (bonus): Familiarity with CRM APIs (HubSpot, Salesforce, Zoho, or similar) is a strong plus.
- PropTech/Real Estate (bonus): Experience with property data, valuation models, real estate or construction project analytics is a plus.
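The retry logic and error handling called for above might look like the following sketch: exponential backoff with jitter around a flaky API call. This is an illustrative pattern only; in practice a library such as tenacity would usually be preferred:

```python
import random
import time

def with_retries(fn, max_attempts=4, base_delay=0.5, sleep=time.sleep):
    """Call fn(), retrying transient failures with exponential backoff.

    Backoff doubles each attempt, plus small random jitter to avoid
    thundering-herd retries against a rate-limited API.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except ConnectionError:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the error upstream
            delay = base_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1)
            sleep(delay)
```

The injectable `sleep` parameter keeps the wrapper unit-testable without real waiting, which matters for the automated quality gates described earlier.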
What benefits can PropHero bring you?
- Impact & Ownership: Your work will directly shape the data foundation that drives our business decisions and customer experience.
- Cutting-Edge Stack: Build on a modern, cloud-native AWS data platform.
- Growth Opportunities: Be part of a fast-scaling team with opportunities to lead data architecture and mentor others.
- Healthy Business: €30M revenue in 4 years, 25% QoQ growth, already profitable.
- Fully remote working arrangement: work flexibly while consistently delivering high-quality results.
At PropHero, we are committed to fostering an inclusive and equitable workplace where diverse perspectives and backgrounds are not only welcomed but celebrated. We believe that diversity drives innovation and empowers us to build stronger connections with our clients and communities. PropHero is an equal opportunity employer and is dedicated to ensuring a hiring process free from discrimination based on race, ethnicity, gender, age, disability, religion, sexual orientation, or any other characteristic protected by law. Our mission is to create a workplace where everyone feels valued, supported, and empowered to achieve their full potential.
Salary Range
Monthly: 2,000 - 2,600 AUD
Yearly: 24,000 - 31,200 AUD