Data engineering for reliable reporting
I build ETL pipelines and KPI dashboards on Databricks and Power BI for finance and management teams.
- Databricks ETL
- KPI Reporting
- Data Quality & Monitoring
Identity
Let's make data work for your business.
How I work
- Translate stakeholder requirements into clear metrics and definitions
- Build reliable Databricks ETL (Python, PySpark, SQL) with reusable patterns
- Validate and monitor data quality to keep reporting consistent
- Deliver KPI-ready models that power Power BI dashboards for finance and management
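The validate-and-monitor step above can be sketched as a small, framework-free Python example. In Databricks this logic would typically run against PySpark DataFrames; the check names, sample rows, and thresholds here are hypothetical, purely to illustrate the pattern of running checks and surfacing failures for alerting.

```python
from dataclasses import dataclass
from functools import partial

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str

def check_not_null(column, rows):
    """Fail if any row is missing a value in `column`."""
    nulls = sum(1 for r in rows if r.get(column) is None)
    return CheckResult(f"not_null:{column}", nulls == 0, f"{nulls} null value(s)")

def check_min_rows(minimum, rows):
    """Fail if the batch is smaller than expected (e.g. a broken upstream load)."""
    return CheckResult("min_rows", len(rows) >= minimum,
                       f"{len(rows)} row(s), expected >= {minimum}")

def run_checks(rows, checks):
    """Run every check; return all results plus the failed subset for alerting."""
    results = [check(rows) for check in checks]
    return results, [r for r in results if not r.passed]

# Hypothetical revenue extract with one bad record.
batch = [
    {"account": "A-1", "revenue": 120.0},
    {"account": "A-2", "revenue": None},
]
results, failed = run_checks(batch, [
    partial(check_not_null, "revenue"),
    partial(check_min_rows, 1),
])
```

The point of the modular shape is that each check is a plain function, so new rules slot into the same `run_checks` pipeline and the failed subset can feed whatever alerting channel the team uses.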
Values
What to expect when we work together
- Data quality by default: automated checks and alerts keep KPIs trustworthy.
- Reusable pipeline design: modular building blocks speed up change requests.
- Business-first reporting: metrics align with stakeholder decisions and governance.
- Clear documentation: shared definitions keep teams aligned and reduce churn.
Skills and toolset
Focused on Databricks, BI reporting, and reliable ETL delivery.
Languages
Platforms
Tools
Practices
Experience
A quick view of the roles that shaped my approach to data engineering.
11/2023 - Present
Data Engineer
BNP Paribas, Lisbon, Portugal
Design and maintain ETL pipelines in Python, PySpark, and SQL, along with Power BI dashboards for finance and management teams.
08/2021 - 10/2023
Data Engineer
Protegrity, Lisbon, Portugal
Built Databricks ETL workflows and cross-team dashboards for revenue, pipeline, and product usage KPIs.
11/2020 - 08/2021
Data Analyst
Megabeetle (Branding Agency), Lisbon, Portugal
Produced performance reports and supported a SQL case management tool for insolvency workflows.