Proptech Start-Up looking for Data Engineer – yuvoh.com
From individuals buying single properties to multi-billion-euro transactions between financial institutions, we’re leveraging big data and analytics to transform the property investment process.
At Yuvoh, everything starts with the data. We collect vast quantities of it every single day from many sources around the world. We then use this data to create analytics and actionable insights for many different markets and use cases.
Our UK property investment portal (yuvoh.com) enables anyone to use our analytics to make informed property purchases, as well as providing a hands-off approach to investing in real estate.
Our rapidly growing B2B business (yuvoh-analytics.com) provides valuations, data-enrichment, and market analytics to banks and financial institutions using cutting-edge machine learning techniques – something that has never been done before in many of the markets we operate in.
What is the team like?
What matters to us most is understanding and solving our customers’ needs. So you can expect a fast pace and varied challenges being tackled in a collaborative atmosphere. Our team thrives in a friendly, relaxed and open culture, which allows people to share ideas across the business.
What the company offers applicants
- Mentoring from tech and banking veterans, helping you build practical skills in a fast-growing fintech start-up.
- A commitment to fostering strong working relationships between new joiners and the whole team.
- Freedom to develop creative and interesting software from the ground up as part of a dynamic team.
- Interesting tasks and modern technologies.
We are looking for a data-driven and highly proactive individual who enjoys finding solutions to problems and delivering data science projects.
You should be comfortable with the following:
- Understanding the key questions and problems facing our clients
- Implementing new solutions to support clients
- Helping clients gain more insight through advanced data enrichment
Key requirements for the role:
- 2–3 years of experience as a Data Engineer (Big Data)
- Experience using Python to build data pipelines and perform Big Data transformations
- Good knowledge of SQL
- Experience with the Hadoop ecosystem (e.g. Hive, Spark, Presto)
- Experience building data integrations using Airflow
Desirable to have:
- Previous experience of working with geospatial data
- Experience working with Google Cloud
- Docker, K8s
- Experience working with streaming technologies (e.g. Kafka, Confluent Cloud)
What you will be doing:
- Developing and maintaining optimal data pipeline architecture
- Identifying ways to improve data reliability, efficiency and quality
- Preparing data for predictive and prescriptive modelling
- Creating data tools for data scientist team members
- Taking part in decision-making on design, solution development and code review
- Working in an international team distributed across Europe
- Delivering the product roadmap, contributing to planning, and creating estimations
A Computer Science or related degree is preferred. If you are a great problem solver with a passion for learning new skills, looking for the chance to create powerful, innovative solutions in a fast-moving start-up, then get in touch with us at email@example.com. Successful applicants can start immediately.