Data Engineer

  • Data
  • Amsterdam, Netherlands

Job description

The rapid growth of Dealroom.co requires a diligent Data Engineer to contribute to the core of our product and company: the data platform. As part of the Data team, you’ll help expand and improve our data product. The Data Engineer’s main objective is to develop our data architecture and data pipelines.

Together with the team, you will design and deploy sustainable data pipelines, primarily using Python, Google Cloud Platform tools, and Airflow. In the near term, the team aims to build a modern, automated infrastructure for processing our data, driving more value for our users as we deliver data of increasing volume and quality. In this role, you’ll drive and resolve all necessary infrastructure needs in cooperation with our backend team, in pursuit of our clearly defined data roadmap. Strategic thinking is required to take the platform to the next level.

At Dealroom.co we value intelligent, articulate, and motivated individuals who want to learn and to work with a great team at a rapidly growing business, but who also thrive when working independently with clearly defined responsibilities.

About Dealroom

Dealroom is Europe’s leading provider of startup data & insights. The Dealroom software, database, and research enable venture capital firms, corporates, advisory firms, and governments to stay at the forefront of innovation, discover promising companies, and identify strategic opportunities.

Our core products are:

1/ The Dealroom SaaS platform offers an intuitive interface to identify and track promising startups and analyse broader industry/business model trends.

2/ The Dealroom API, enabling our clients to integrate Dealroom data straight into their own applications and processes.

3/ Custom research projects in which we combine Dealroom data with expert insights. The scope of these projects ranges from market scans and industry deep-dives to high-level analyses of trends in startups, employment, and venture capital.

4/ Dealroom Ecosystem Solutions, which provide an open (and crowdsourced) platform that helps local governments, founders, and investors keep track of activity in their local tech ecosystem, showcase it to the world, and foster networking. Examples include Berlin, the Netherlands, and the United Kingdom.

Among our clients are world-leading firms including McKinsey, BCG, Deloitte, Google, Amazon, Microsoft, Sony, Sequoia, Insight Venture Partners, Balderton Capital, EQT, and many others that rely on Dealroom data as part of their strategic decision-making.

Team

We operate from the heart of Amsterdam. We’re passionate about everything happening around technology, start-ups, and scale-ups, and aim to capture this in actionable data points for our users. Our company culture is ambitious, diligent, and proactive, but also fun and playful, including weekly team drinks.

The Data team consists of 15 members covering AI, data science, data engineering, BI, and data quality.

Responsibilities

  • You will develop, maintain, and test data pipelines that capture, integrate, and centralise data flowing from all our systems and processes.

  • You will design, develop, and improve data and system architecture.

  • Work with Data Scientists and Data Analysts to support the implementation of algorithms, models, and the underlying data infrastructure.

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Python and GCP ‘big data’ technologies (Dataflow, Pub/Sub, BigQuery, Dataproc, Airflow).

  • Maintain and improve the existing data pipelines and data products.

  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

What we can offer you

  • A competitive salary

  • Freedom and responsibility to execute on your own ideas

  • Positive, empathic, and outward-focused colleagues and customers who are at the forefront of innovation and technology

  • A great office in the heart of a vibrant neighborhood

  • Perks designed to set you up for success:
    • Apple or Windows laptop of your choice

    • Flexible working hours

    • Home-office setup (screen, etc.)

    • Travel reimbursement

  • Additionally:
    • Take your birthday off

    • Eligibility for the Employee Stock Units plan

    • Learning & development budget

  • Weekly yoga & workout sessions

  • More to come!

Job requirements

Who you are / Background

  • You have a university degree in computer science (bachelor’s or master’s)

  • 4+ years of industry-relevant experience building data architectures using batch and streaming technologies. Bonus for experience with GCP.

  • Highly skilled in Python, Git, and MySQL

  • Experience designing ETL and data pipelines

  • Experience with CI/CD and infrastructure-as-code tools (e.g. CircleCI, Terraform)

  • Experience with cloud infrastructure & cloud data warehouses and services (preferably Google BigQuery & Kubernetes)

  • Experience with workflow orchestration tools (e.g. Airflow)

  • Bonus for experience with stream-processing frameworks and programming models such as Apache Spark, Apache Flink, and Apache Beam

  • You are meticulous and creative

  • You have a passion for technology, startups, and venture capital markets.

  • Comfortable working in a multicultural and fast-growing start-up environment

  • Self-motivated, detail-oriented, proactive, and ambitious, with the ability to prioritize

  • Fluent in English; any additional European language is a big advantage