Sift is the leading innovator in Digital Trust & Safety. Sift helps stop fraud before it happens. Hundreds of disruptive, forward-thinking companies like Coinbase, Zillow, and Twitter trust Sift to deliver an outstanding customer experience while preventing fraud and abuse.
Sift is a Series E company that reached unicorn status in 2021 with a valuation of $1.7 billion. Sift has acquired two startups, Chargeback and Keyless, to extend its product portfolio, and was named Best Employer in Seattle in 2020.
Sift is an ML-based big data platform that handles 70B API requests per month, processes 1 PB of data, and serves tens of thousands of transactions per second.
Sift's mission: help everyone trust the Internet.
Let’s Build It Together:
At Sift, we are intentionally building a diverse, equitable, and inclusive workplace. We believe that diversity drives innovation, equity is a fundamental right, and inclusion is a basic human need. We envision a place where all Sifties feel secure sharing their authentic selves and diverse experiences with their teams, their customers, and their community – ultimately using this empowerment and authenticity to build trust and create a safer Internet.
The Data Platform team is responsible for making Sift’s data easy to use, understand, and communicate. This team ensures the availability, correctness, and data privacy compliance of information critical for Sift’s day-to-day operations. Our customers include not just Sift’s data science product teams, but also our sales, services, and business operations teams. We are excited about our plans to build our next-generation data analytics solution.
Our R&D team consists of over 100 people, 35 of whom are based in the Kyiv R&D office. We plan to add three software engineers to the Ukraine R&D team as part of our Data Platform team.
Data Platform technical stack:
- Java 11
- Python 3
- DataProc, Spark
- Apache Airflow
Other Sift products' technical stack: Hadoop, Flink, AWS, Ruby on Rails; front end: React.js
We use Scrum with two-week sprints.
Opportunities for you:
- Professional growth: quarterly Growth Cycles instead of performance reviews
- Experience: knowledge sharing through biweekly Tech Talk sessions. You will learn how to build projects that handle petabytes of data with low latency and high fault tolerance.
- Business trips and the annual Sift Summit.
- Hybrid work approach: you can choose where you work best, remotely or in the office
What you’ll do:
- As a senior software engineer on Sift’s Data Platform team, you will build data warehousing and business intelligence systems to empower engineers, data scientists, and analysts to extract insights from data.
- You will design and build petabyte-scale systems for high availability, high throughput, data consistency, security, and end-user privacy, defining our next generation of data analytics tooling. You will improve data models and ETL pipelines to increase efficiency and data quality.
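To give a concrete (purely illustrative) flavor of the ETL work described above, here is a minimal batch transformation sketch in plain Python: deduplicate raw events, then aggregate amounts per user. The field names (`event_id`, `user_id`, `amount`) are hypothetical, not Sift's schema; in production a step like this would typically run on Spark via Dataproc and be scheduled by Airflow.

```python
from collections import defaultdict

def dedupe_and_aggregate(events):
    """Toy ETL step: drop duplicate events by id, then sum amounts per user.

    `events` is an iterable of dicts with 'event_id', 'user_id', and
    'amount' keys (hypothetical field names, for illustration only).
    """
    seen = set()
    totals = defaultdict(float)
    for e in events:
        # Skip replayed/duplicated events so reprocessing a batch is safe.
        if e["event_id"] in seen:
            continue
        seen.add(e["event_id"])
        totals[e["user_id"]] += e["amount"]
    return dict(totals)
```

Because duplicates are dropped by event id, rerunning the step over the same batch yields the same totals, which is the kind of idempotence that matters when pipelines are retried.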
What would make you a strong fit:
- Experience working with large datasets and data processing technologies for both stream and batch processing, such as Apache Spark, Apache Beam, MapReduce, or Flink.
- Experience with complex SQL and ETL/ELT development
- Experience designing and building data warehouse, data lake, or lakehouse solutions
- Experience with distributed systems and distributed data storage.
- Experience with large-scale data warehousing solutions, like BigQuery, Snowflake, Redshift, Presto, etc.
- Professional software development experience or a degree in CS (or a related field).
- Strong communication and collaboration skills, particularly across teams or with functions such as data science or business analysis.
- Experience with Python, Java, or similar languages
- Cloud infrastructure (e.g. GCP, AWS) experience
- Experience working with big data processing infrastructure such as AWS Elastic MapReduce, GCP Dataproc, or Dataflow.
- Experience with workflow orchestrators (Airflow, Cloud Composer)
- Experience with the analytics presentation layer (Dashboards, Reporting, and OLAP)
- Experience with designing for data compliance and privacy
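As a small, hypothetical illustration of the "complex SQL" and analytics items above, the query below ranks each user's transactions with a window function. It runs against an in-memory SQLite database for portability; warehouse engines such as BigQuery, Snowflake, or Redshift support the same construct. The table and data are invented for this sketch.

```python
import sqlite3

# Build a tiny in-memory transactions table (illustrative data only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE transactions (user_id TEXT, amount REAL);
    INSERT INTO transactions VALUES
        ('a', 10.0), ('a', 30.0), ('b', 20.0);
""")

# Rank each user's transactions by amount using a window function --
# the kind of SQL common in warehouse/BI work.
rows = conn.execute("""
    SELECT user_id,
           amount,
           RANK() OVER (PARTITION BY user_id ORDER BY amount DESC) AS rnk
    FROM transactions
    ORDER BY user_id, rnk
""").fetchall()
# rows -> [('a', 30.0, 1), ('a', 10.0, 2), ('b', 20.0, 1)]
```

Window functions (available in SQLite 3.25+) let you compute per-partition rankings without a self-join, which keeps analytics queries both faster and easier to read.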
Our Hiring Process
We follow the same process for all teams; the technical interview consists of two parts:
- A 45-minute technical phone interview with the engineering manager, including one LeetCode-medium-level coding task;
- A virtual on-site interview: four 45-minute sessions covering coding, system design, experience, and soft skills.
During these sessions, you will have the opportunity to learn about the company culture, meet engineers from your team, and discuss distributed-systems problems. You will have time to ask questions and get a transparent picture of your future responsibilities and the project.
What we offer:
- A compensation package that consists of financial compensation, a biannual 5% bonus, and stock options;
- Medical, dental, and vision coverage;
- $50 for sports and wellness;
- Education reimbursement: books, education courses, conferences;
- Flexible time off: we follow an unlimited vacation approach;
- Work schedule tuned to the Kyiv timezone despite the US office locations: biweekly demo sessions are optional for our team, and we can watch the recordings;
- Mental Health Days: additional days off;
- English courses and in-company social activities that help you improve your public speaking and language skills.