Sorry, this offer is no longer available, but you can perform a new search or explore similar offers:

Data Analyst | Business Analyst | Fresher IT Job - Freshers Can Apply, Salary 45K

Job Details for Gurgaon, Noida and Delhi. Full Time in MNC. Salary: 6 to 10 LPA. Direct Apply @ Link: https://www.slaconsultantsindia.com/institute-for-data-analytic...


New Delhi

Published a month ago

Application Developer: RDBMS

Responsible for the design, development, coding, testing, debugging, and documentation of applications to satisfy the requirements of user areas. Responsible to...


From IBM Careers - New Delhi

Published 25 days ago

Power Bi

Data Visualization PBI skills: Hands-on experience with Power BI Desktop dashboard and report development, with sound formatting and navigation skills. Strong ap...


From Birlasoft Limited - New Delhi

Published 25 days ago

Telesales Executive

Job Title: Tele Callers. Job Type: Full-time (WFO). Experience: 0-6 months. Salary: 12,000 to 15,000 per month. Schedule: Day shift. Interested candidates can connect ...


New Delhi

Published 25 days ago

Sr. Data Engineer - Data Platform

Sr. Data Engineer - Data Platform
Company:

Paytm


Details of the offer

About Paytm Labs:

At Paytm Labs, we're on a mission to provide useful technological solutions that enrich and empower millions of people in their daily lives. We apply big data, artificial intelligence, and machine learning to bring the next generation of financial products and services to global markets. We recently soft-launched Pi, our full end-to-end fraud management system that can make real-time decisions. It works as a dynamic rule engine for day-to-day analyses and procedures. Check out more about Pi here: pi.paytm.com.

Job Description:

If working with billions of events, petabytes of data, and optimizing for the last millisecond is something that excites you, then read on! We are looking for a Senior Data Engineer who has seen their fair share of messy data sets and has been able to structure them for fraud detection and prevention, anomaly detection, and other AI products.

You will be writing frameworks for real-time and batch pipelines that ingest and transform events from hundreds of applications every day. These events are consumed by both machines and people: our ML and software engineers use them to build new models, and to optimize existing ones, to detect and fight new fraud patterns. You will also help optimize the feature pipelines for fast execution and work with software engineers to build event-driven microservices.

You will get to put cutting-edge tech into production, with the freedom to experiment with new frameworks, try new ways to optimize, and the resources to build the next big thing in fintech using data!

This position operates on a hybrid model and is based in our Toronto office; mandatory in-office days are every Tuesday and one Friday every month. These designated days and their frequency may change at the company's discretion.

Responsibilities:

- Work directly with the Platform Engineering Team to create reusable experimental and production data pipelines and centralize the data store.
- Unbox, deep-dive, understand, tune, and master the frameworks and technologies used day-to-day.
- Adopt problem-solving as a way of life: always go to the root cause.
- Keep the data whole, safe, and flowing, with expertise in high-volume data ingest and streaming platforms (such as Spark Streaming and Kafka).
- Make the data available for online and offline consumption by machines and humans.
- Maintain and optimize the underlying storage systems to perform according to the set SLAs.
- Shepherd and shape the data by developing efficient structures and schemas for data in storage and in transit.
- Explore new technology options for data processing and storage, and share them with the team.
- Develop tools and contribute to open source wherever possible.

Qualifications:

- Degree in Computer Science, Engineering, or a related field.
- Proficient in Spark, Scala, Python, or Java.
- You are passionate about producing clean, maintainable, and testable code as part of a real-time data pipeline.
- You have worked with Spark and Kafka before, have experimented with (or at least heard about) Spark Streaming, Kafka Streams, and Flink, and understand when to use one over the other.
- You have experience implementing offline and online data processing flows and understand how to choose and optimize the underlying storage technologies.
- You have worked or experimented with NoSQL databases such as Cassandra.
- You can connect different services and processes together, even ones you have not worked with before, and follow the flow of data through various pipelines to debug data issues.
- You have previously built serious data pipelines ingesting and transforming more than 10^6 events per minute and terabytes of data per day.
- On a bad day, maintaining and bringing up a cluster doesn't bother you.
- You may not be a computer-network expert, but you understand the issues with ingesting data from applications in multiple data centers across geographies, on-premises and cloud, and will find a way to solve them.

Why join Paytm Labs:

- For the fifth year in a row, we are proud to announce that we have been certified as a Great Place to Work.
- We are an open work environment that fosters collaboration, ownership, creativity, and urgency.
- We ensure flexible hours outside of our core working hours.
- Enrolment in the group health benefits plan from day 1, with no waiting period.
- Team-building events.
- We support continued learning and self-improvement.
- Fuel for the day: weekly delivery of groceries and all types of snacks.
- Catered desserts every month.

"We pooled our knowledge of the space and our world-class engineering talent to produce Pi. It's everything we wanted in an FRM, saving us hundreds of millions of dollars." - Harinder Takhar, CEO

Go Big or Go Home!

Paytm Labs believes in diversity and equal opportunity, and we will not tolerate any form of discrimination or harassment. Our people are critical to our success, and we know the more inclusive we are, the better our work will be.

We thank all applicants; however, only those selected for an interview will be contacted. Paytm Labs is committed to meeting the accessibility needs of all individuals in accordance with the Accessibility for Ontarians with Disabilities Act (AODA) and the Ontario Human Rights Code (OHRC). Should you require accommodations during the recruitment and selection process, please let us know.
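The posting describes Pi as a dynamic rule engine that makes real-time decisions over event streams. Pi's actual design is not public; as a purely illustrative sketch, a minimal rule engine of that general shape evaluates each incoming event against a configurable list of predicates and returns the rules that fire. All rule names, event fields, and thresholds below are hypothetical.

```python
# Hypothetical sketch of a dynamic rule engine for real-time decisions,
# in the spirit of the fraud-management system described above.
# None of these rules or fields reflect Pi's real implementation.

from dataclasses import dataclass
from typing import Callable


@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]  # returns True when the rule fires


def evaluate(event: dict, rules: list[Rule]) -> list[str]:
    """Return the names of all rules that flag this event."""
    return [r.name for r in rules if r.predicate(event)]


# Example rules: a transaction-amount threshold and a new-device check.
rules = [
    Rule("high_amount", lambda e: e.get("amount", 0) > 10_000),
    Rule("new_device", lambda e: e.get("device_age_days", 999) < 1),
]

event = {"amount": 25_000, "device_age_days": 0}
print(evaluate(event, rules))  # ['high_amount', 'new_device']
```

Because the rule list is plain data, rules can be added, removed, or re-tuned at runtime without redeploying the pipeline, which is the main appeal of a "dynamic" rule engine.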


Source: Lever_Co
