
Data Science Intern

Full-time, Geoblink Madrid

About Geoblink

We're a fast-growing startup that has already raised close to $8 million in investment from leading venture capital firms, and we were named by Bloomberg as one of the 50 most promising startups in the world to look out for.

Our goal is to revolutionise the world of Location Intelligence and the way businesses think about, and act upon, data. Getting there means solving some of the most meaningful problems businesses face today, and it takes determination, innovation, and support from and for our colleagues.

We are now once again opening the Geoblink Internship Program and looking for a bright individual who is passionate about using data to deliver great analyses, insights and predictive models. So if you are looking for an internship where you will learn new tech skills and work with Python frameworks and real data from a wide variety of sources, you have come to the right place!

Internship Requirements

  • Experience with Object Oriented Programming in Python
  • Experience using the Pandas package to load and manipulate data in Python
  • Experience writing SQL queries to retrieve data from databases
  • Comfortable working under a Linux distribution (such as Ubuntu)
  • Extra kudos if you have experience with Git as a version control system
  • Experience with the scikit-learn library is not required but appreciated
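To give a flavour of the first three requirements together, here is a minimal, self-contained sketch that writes a SQL query against an in-memory SQLite database and loads the result into a Pandas DataFrame for manipulation. The table, column names, and figures are purely illustrative.

```python
import sqlite3

import pandas as pd

# Illustrative in-memory database with a small demographic table
# (names and numbers are made up for this example).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE population (region TEXT, inhabitants INTEGER)")
conn.executemany(
    "INSERT INTO population VALUES (?, ?)",
    [("Madrid", 3223000), ("Barcelona", 1620000), ("Valencia", 791000)],
)
conn.commit()

# Retrieve the data with a SQL query, straight into a DataFrame
df = pd.read_sql_query(
    "SELECT region, inhabitants FROM population ORDER BY inhabitants DESC",
    conn,
)

# Manipulate it with Pandas: derive each region's share of the total
df["share"] = df["inhabitants"] / df["inhabitants"].sum()
print(df)
```

The same pattern (query a database, load into Pandas, transform) is the day-to-day core of the kind of data work described below.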

What you can expect from the internship

As a Data Science Intern, you will be part of the team responsible for making Geoblink's app as scalable and accurate as possible. Geoblink relies on a large number of spatial datasets to model urban behaviour, and you will be responsible for automating the transformation and integration of existing and new datasets from the different countries we work in. You will also learn and do the following:

  • Build pipelines in Airflow for the ETL of different data sources (demographic, socioeconomic, urban, etc.)
  • Improve the quality of the existing framework by increasing test coverage or enhancing existing functionality
  • Take responsibility for integrating new data sources: from evaluating the suitability of a new source and understanding the transformations it requires, to the final implementation of the full ETL process
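As a rough idea of what an ETL task looks like, here is a hedged sketch of the three classic steps: extract raw data, transform it with Pandas, and load the result. All function names, columns, and values are hypothetical, and the "load" step writes to a plain dictionary as a stand-in for a real database; in production, steps like these would be wired together as tasks in an Airflow pipeline.

```python
import pandas as pd


def extract() -> pd.DataFrame:
    # Stand-in for reading a raw source (CSV file, API, database dump)
    return pd.DataFrame(
        {"zone": ["A", "B", "B"], "population": ["100", "250", "50"]}
    )


def transform(raw: pd.DataFrame) -> pd.DataFrame:
    # Clean types, then aggregate to one row per zone
    raw["population"] = raw["population"].astype(int)
    return raw.groupby("zone", as_index=False)["population"].sum()


def load(clean: pd.DataFrame, store: dict) -> dict:
    # Stand-in for writing into the production database
    store["demographics"] = clean
    return store


# Run the pipeline end to end
store = {}
result = load(transform(extract()), store)["demographics"]
print(result)
```

Splitting the work into small, independently testable steps like these is what makes it possible to orchestrate, retry, and monitor them as separate tasks.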
Published: March 20, 2019

Apply for this position