Description
About Us
Sunbit builds financial technology for real life. Our technology eases the stress of paying for life’s expenses by giving people more options on how and when they pay. Founded in 2016, Sunbit offers a next-generation, no-fee credit card that can be managed through a powerful mobile app, as well as a point-of-sale payment option available at more than 25,000 service locations, including auto dealership service centers, optical practices, dentist offices, veterinary clinics, and specialty healthcare services. Sunbit was included on the 2022 Inc. 5000 list. The financial technology company has also been named a Most Loved Workplace®, Best Point of Sale Company, and a Top Fintech Startup by CB Insights. We use cutting-edge innovations in financial technology to deliver data and features that qualify individuals instantly, making purchases at the point of sale fast, fair, and easy for consumers from all walks of life. Guided by our core values, we work tirelessly to make Sunbit available to everyone, everywhere.
What You’ll Do:
- Create new data solutions, maintain existing ones, and be the focal point for all technical aspects of our data activity. You will develop advanced data and analytics to support our analysts and production with validated, reliable data. The ideal candidate is a hands-on professional with strong knowledge of data pipelines and the ability to translate business needs into flawless data flows.
- Create ELT and streaming processes and SQL queries to move data to and from the data warehouse and other data sources.
- Own the data lake pipelines, including their maintenance, improvements, and schema.
- Create new features from scratch, enhance existing features, and optimize existing functionality.
- Collaborate with stakeholders across the company, such as data developers, analysts, and data scientists, to deliver team tasks. Work closely with all business units and engineering teams to develop a long-term data platform architecture strategy.
- Implement new tools and development approaches.
- Ensure adherence to coding best practices and the development of reusable code.
- Continuously monitor the data platform and make recommendations to enhance the system architecture for both ETL/ELT and real-time pipelines.
Requirements
- 3+ years of experience as a Data Engineer
- 3+ years of direct experience with SQL (e.g., Redshift, Postgres, MySQL, Snowflake), data modeling, data warehousing, and building ELT/ETL pipelines - MUST
- 2+ years of experience with Python
- 3+ years of experience in scalable data architecture, fault-tolerant ETL, and monitoring of data quality in the cloud
- Experience working with cloud environments (AWS preferred) and big data technologies (EMR, EC2, S3, Snowflake, Spark Streaming, DBT, Airflow)
- Exceptional troubleshooting and problem-solving abilities, including debugging and root-causing defects in large-scale systems.
- Deep understanding of distributed data processing architecture and tools such as Kafka, Spark, and Airflow
- Experience with design patterns and coding best practices, understanding of data modeling concepts, techniques, and best practices
- Proficiency with modern source control systems, especially Git
- Basic Linux/Unix system administration skills
Nice to have
- BS or MS degree in Computer Science or a related technical field
- Experience with data warehouses
- Experience with NoSQL and large-scale databases
- Understanding of fintech business processes
- DataOps experience, particularly on AWS
- Microservices
- Experience with DBT
What Else
- Energetic and enthusiastic about data
- Analytical
- Self-motivated and able to work well both independently and as part of a team
- A team player with excellent verbal and written communication and data presentation skills, including experience communicating with business and technical teams.
- Love to explore new technologies; a fast self-learner who can quickly master new concepts, disciplines, and methods.