About Me

I am a data scientist with a software engineering background, passionate about learning and technology. My main interests are Machine Learning and Data Science, which have been at the core of my early career.

Education

BSc in Computer Engineering

2016 - 2019
University of Minho
  • Courses: Calculus, Linear Algebra, Algorithms and Complexity, Operating Systems, Computer Architecture, Functional Programming, Imperative Programming, Object-Oriented Programming, Numerical Optimization.

MSc in Computer Engineering

2019 - 2021
University of Minho

Experience

Data Scientist at Uphold Inc. 2021 - Present
  • Conducted in-depth data analysis and developed dashboards used by leadership and by the Product, Marketing, Business Development, and Finance teams to improve targeting, increase revenue, and allocate resources more efficiently based on key business metrics.
  • Implemented a recommender system to help millions of users diversify their cryptocurrency investment portfolios.
  • Designed A/B tests and other controlled experiments to continuously improve features and analyze the impact of new releases (see the sketch after the tool list below). Advocated for the systematic use of this approach as the standard for decision-making.
  • Played a key role in the collaboration between the Data and Product teams, pursuing a data-driven product development process.
Main Tools: Python, SQL (PostgreSQL & Snowflake), Looker, Metabase, pandas, sklearn.
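A minimal sketch of the kind of A/B test readout involved in this work, using a two-proportion z-test on a binary conversion metric. The data, column names ("variant", "converted"), and effect sizes are simulated for illustration only and are not Uphold's actual experiments or code.

import numpy as np
import pandas as pd
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(42)

# Hypothetical experiment data: one row per user, randomly assigned to a variant.
df = pd.DataFrame({"variant": rng.choice(["control", "treatment"], size=20_000)})
base_rate = np.where(df["variant"] == "treatment", 0.065, 0.060)  # simulated lift
df["converted"] = rng.random(len(df)) < base_rate

# Aggregate conversions and sample sizes per variant.
summary = df.groupby("variant")["converted"].agg(["sum", "count"])
print(summary)

# Two-proportion z-test: is the difference in conversion rates significant?
stat, p_value = proportions_ztest(summary["sum"].values, summary["count"].values)
print(f"z = {stat:.2f}, p-value = {p_value:.4f}")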
Data Scientist, ML Engineer at Anybrain 2020 - 2021
  • Designed a system for cheat detection and continuous authentication in video games using deep learning and multivariate time series.
  • Developed an end-to-end data science & machine learning pipeline:
    • ETLs processing user input data (MongoDB, Python, pandas, NumPy);
    • Data cleaning and feature engineering to produce a dataset of multivariate time series;
    • Training, hyperparameter optimization, and evaluation of CNNs for cheat detection and continuous authentication (TensorFlow + Keras, sklearn, Optuna; see the sketch after the tool list below);
    • Serving the trained models through a RESTful API (Java, TensorFlow Java API, Spring Boot, Docker).
Main Tools: MongoDB, Python, pandas, NumPy, TensorFlow + Keras, Optuna, Dataiku, Java, Docker, Spring Boot, H2O.ai.
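A minimal sketch of the CNN training and Optuna hyperparameter search step described above, run on synthetic multivariate time series. The input shapes, layer choices, search ranges, and labels are illustrative assumptions, not the production model.

import numpy as np
import optuna
import tensorflow as tf
from tensorflow.keras import layers

# Synthetic stand-in for the real dataset: 1,000 sequences of 128 timesteps x 6 features,
# with a binary label (e.g. cheating vs. legitimate input).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 128, 6)).astype("float32")
y = rng.integers(0, 2, size=1000)
split = 800
X_train, X_val, y_train, y_val = X[:split], X[split:], y[:split], y[split:]


def build_model(filters: int, kernel_size: int, dropout: float, lr: float) -> tf.keras.Model:
    """1D-CNN binary classifier over multivariate time series."""
    model = tf.keras.Sequential([
        layers.Input(shape=X_train.shape[1:]),
        layers.Conv1D(filters, kernel_size, activation="relu"),
        layers.Conv1D(filters * 2, kernel_size, activation="relu"),
        layers.GlobalAveragePooling1D(),
        layers.Dropout(dropout),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
        loss="binary_crossentropy",
        metrics=[tf.keras.metrics.AUC(name="auc")],
    )
    return model


def objective(trial: optuna.Trial) -> float:
    """Optuna objective: maximize validation AUC."""
    model = build_model(
        filters=trial.suggest_int("filters", 16, 64),
        kernel_size=trial.suggest_int("kernel_size", 3, 9),
        dropout=trial.suggest_float("dropout", 0.0, 0.5),
        lr=trial.suggest_float("lr", 1e-4, 1e-2, log=True),
    )
    model.fit(X_train, y_train, validation_data=(X_val, y_val),
              epochs=5, batch_size=64, verbose=0)
    _, auc = model.evaluate(X_val, y_val, verbose=0)
    return auc


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print("Best params:", study.best_params, "best AUC:", study.best_value)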
Business Intelligence Intern at XpandIT 2019
  • Helped extend the company’s internal framework by developing a dashboard to monitor ETL execution logs. The ETL processes were designed, managed, and executed using Pentaho Data Integration.
Main Tools: Pentaho Data Integration.