Job Description
Loggi is building an entirely new class of logistics: a last-mile solution powering same-day delivery and blurring the line between online and offline for retail.
We’re working hard to leapfrog hundreds of years of poor infrastructure investment. We are confident that our effort is a net gain for the world and that it is making, and will keep making, a large impact in emerging markets. Our work is original, and so our challenges are unique. Your work is reflected daily in the thousands of drivers whose lives Loggi is changing, as well as in the many customers who are delighted to receive timely and predictable service.
Our product is remarkably complex: it weaves together online and offline elements, software and physical services, and interfaces with both businesses and end consumers. It is the focal point for an operations-, sales-, and engineering-heavy company.
Since launch, we have been growing aggressively. We’re laser-focused on product and user experience, and we intend to stay that way.
We are looking for a software engineer who will be responsible for designing, implementing, and maintaining a large-scale, low-latency data pipeline that will inform business decisions across Loggi!
Loggi believes different people think differently, which leads us to diverse solutions and new ways of making things happen. We are committed to diversity along multiple dimensions: race, gender, sexual orientation, gender identity, age, religion, and disability. Join the transformation!
What you will do:
- Build large-scale batch and real-time data pipelines with data processing frameworks and systems such as Druid, Hive, Presto, ClickHouse, and Spark
- Build tools that monitor the health of our data/metrics pipeline and alert when something is astray
- Collaborate with other engineers, product managers, and stakeholders to make sure the data/metrics being collected are correct for the business scenario
- Apply best practices in logging and metrics collection across all subsystems of the platform
- Keep concise, clear, and comprehensible documentation for the systems you own
Who you are:
- You know how to work with high-volume, heterogeneous data, preferably with distributed systems such as Kafka, Hadoop, Cassandra, Elasticsearch, and Bigtable
- You have strong experience with data modeling, access, and storage techniques
- You have strong analytical and diagnostic skills
- You have strong scripting skills (Python, Perl, shell, etc.)
What we offer:
- Freedom to think and explore
- A fun, laid-back environment in a fast-paced startup
- Feedback and brainstorming directly with the founders. No bureaucracy, no bull****
- Competitive compensation based on your experience
- Choice of tools
- We score 10.5 out of 12 on the Joel Test
- After completing one year as a Logger, we will sponsor your attendance at a technical conference of your choosing
Apply to this post!
PS: Applicants from anywhere are very welcome; however, this is an onsite position, and you must be legally allowed to work in Brazil.