Data Science Engineer
About Paidy Inc.
Paidy is Japan's pioneering and leading BNPL (Buy Now, Pay Later) service, with the mission to spread trust through society and to give people room to dream.
Paidy offers instant, monthly-consolidated credit to consumers by removing hassles from payment and purchase experiences. Paidy uses proprietary models and machine learning to underwrite transactions in seconds and guarantee payments to merchants. Paidy increases revenue for merchants by reducing the number of incomplete transactions, increasing conversion rates, boosting average order values, and facilitating repeat purchases from consumers.
Paidy continues to innovate to make shopping easier and more fun both online and offline. For more information, please visit http://www.paidy.com.
The Risk & Analytics Team lives at the core of the business model — figuring out how to take smart risk that allows for a smooth and empowering experience for all legitimate users, while also preventing fraud and protecting the company’s bottom line.
Data Science Engineers in Risk & Analytics deploy new data structures and technologies to make the lives of our data scientists easier. We are looking for someone who will work closely with our data scientists and analysts, understand their needs, investigate technical solutions, and build and maintain the tools, databases, and data pipelines that facilitate analysis, model building, monitoring, and more.
Be part of the entire journey: understanding the needs of data scientists (even those not explicitly stated); building PoCs; collaborating with other engineers to set up infrastructure, build, and deploy solutions; and making sure they keep working.
Help bring new tools and data structures to production to make the lives of data scientists and analysts easier.
Be an educator on how to use the tools and technologies you deploy, and help elevate the technological IQ of the department.
We are looking for someone who:
Enjoys problem-solving, learning new technologies, and helping others get their work done
Has worked as a data engineer, or as a particularly technical data scientist
Has experience in SQL and Spark that goes beyond the basics
Has worked in Python, Scala, R or a similar programming language
Is comfortable working with an existing codebase, using git or similar version control
Nice-to-haves:
Experience as a data scientist, scientific researcher, or in a data analytics role
Experience maintaining and optimizing a PostgreSQL database
Experience with AWS, or with cloud computing and cloud infrastructure in general
Experience with Terraform or similar infrastructure-as-code tools
Experience with a batch job orchestration tool such as Prefect or Airflow
Familiarity with CI/CD concepts and tools (CircleCI, Jenkins, ...)
Experience on a payment platform or in another financial technology field
Experience with, and an understanding of, NoSQL databases and message brokers
Own it and deliver / 結果を出す
- Commit to what, when and how to deliver/ 目的・やり方・期限にコミットする。
- Own the actions to deliver / 結果のためのアクションにこだわる
- Embrace conflict when needed to deliver results / 必要なら対立・衝突も恐れない
Play an integral role / 大切なピースになる
- Make an irreplaceable contribution to our business / 替えの効かない貢献をする
- Embrace and bridge differences in language and culture / 皆が言語と文化の架け橋になる
- Raise the bar / スタンダードを上げ続ける
Be a winner / 勝ちにこだわる
- Beat expectations / 常に期待値を超える
- Display surprising speed / 人をスピードで驚かす
- Embrace risk / リスクを恐れない