At Sendy, we specialize in connecting on-demand, trusted, and transparent service providers with individuals and businesses looking to move packages. We believe it is a natural extension of our core expertise to also connect our users with quality, certified transportation providers. Today, Sendy users can request a package delivery using a motorcycle, van, pickup, or 3-ton truck, or a passenger ride with a boda boda or cab, all from within the same Sendy mobile app (available on Android and iOS) and web platform. All delivery and ride services are available 24 hours a day, 7 days a week, and can be paid for with cash, M-Pesa, or card.
About the Role
The data engineer will expand and optimize our data architecture and data pipelines with the aim of improving data analysis and ML predictions. They will design and build scalable data systems and research new use cases for data acquisition.
Key Duties and Responsibilities
- Develop and update the data storage strategy in line with the overall business strategy.
- Create and maintain optimal data pipeline architecture, with a focus on ML.
- Improve the existing Snowflake setup.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and other 'big data' technologies.
- Work with stakeholders, including the Product, Data, and Growth teams, to assist with data-related technical issues and support their data infrastructure needs.
- Improve access speed and optimize data storage for ML tasks.
- Build high-performance algorithms, predictive models, and prototypes.
- Any other duties that may be assigned by the supervisor.
About You
- Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field.
- Minimum of two years' experience in a data engineer or similar role.
- Experience with cloud data warehousing platforms such as Snowflake.
- Advanced working knowledge of SQL and experience with relational databases, plus working familiarity with a variety of other databases.
- Experience building and optimizing 'big data' pipelines, architectures, and data sets, e.g. Hadoop, Spark, Kafka.
- Experience with relational SQL and NoSQL databases, including Postgres.
- Experience with data pipeline and workflow management tools, e.g. Azkaban, Luigi, Airflow.
- Experience with object-oriented/functional scripting languages, e.g. Python, Java.
- Strong analytic skills related to working with unstructured datasets.
- Strong project management and organizational skills.
- Experience supporting and working with cross-functional teams in a dynamic environment.
What We Offer
- Comprehensive health insurance: inpatient, outpatient, dental, and optical cover.
- Flexible vacation.
- All-risk insurance.
- Office lunch.
- Opportunity for company stock options.