About the Company
IndexLab is a new research and intelligence company specialising in measuring the use of AI and other emerging technologies.
Businesses are racing to deploy AI in everything from agriculture to banking. These technologies have the potential to transform every aspect of our lives, and yet no one knows exactly what businesses are doing with them or how.
That’s where we come in. We’re setting out to build the world’s first index to publicly rank the largest companies in the world on their AI maturity, using advanced data gathering techniques across a wide range of unstructured data sources.
We are just starting out and are looking to hire multiple data scientists to help build our first Index.
About the Role
This is an exciting role where you will be working across the entire product lifecycle and have the opportunity to shape the product from the start.
This role would be ideal for someone who likes to consider the context of the data they are working with and enjoys testing out new ideas to improve existing processes and products.
What you'll do:
Work with researchers to scope a research area, evaluate the potential data sources, and find reliable ways to quantify that data.
Build tools and scripts to collect data from a variety of sources via web scraping, third-party APIs, and machine learning, then work with our Data Engineers to get them into production.
Analyse the database you’ve helped to build to provide insights for research products, and then turn your findings into predictive models to forecast trends in the industry.
Help to develop internal packages and software that make life easier for our researchers and data scientists.
Translate research, insights, and final index results into clear and interesting dashboards that will become part of our first product offering.
Required skills:
Comfortable working independently on semi-structured data problems.
3+ years of experience analysing data and presenting clear insights to both technical and non-technical audiences.
Demonstrable experience using dashboarding tools such as Tableau, Looker, Shiny, or Dash.
Experience with, or interest in, web scraping and data collection techniques.
Nice to have skills:
Experience using machine learning techniques for text extraction, NLP, or computer vision.
Experience using Google Cloud products such as BigQuery, and building data models or pipelines.
Interest in web development: frontend (React) or backend (Django).
Experience working in a start-up.