About Job
CTC: Undisclosed | Job Location: Canada | Experience: 5 - 8 yrs
Description
Required Skills
Extensive experience with Big Data and distributed systems
System design skills; ability to design large-scale distributed systems
Prior experience building data platforms using the Big Data stack (Kafka, Hadoop, Spark, Flink, Hive, ...) on public cloud
Excellent programming skills in Java and/or Scala
Experience with stream processing using Spark or Flink (see the sketch after this list)
Understanding of distributed systems concepts and principles (consistency and availability, liveness and safety, durability, reliability, fault-tolerance, consensus algorithms)
Deep understanding of Algorithms, Data Structures, and Performance Optimization Techniques
Eager to learn new things and passionate about technology
Comfortable working with Kubernetes, AWS, Docker, and Terraform.
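To give a concrete feel for the stream-processing work described above, here is a minimal Spark Structured Streaming sketch in Scala that reads a Kafka topic and maintains windowed counts. The broker address, topic name, window and watermark sizes, and local master setting are placeholder assumptions, and the spark-sql-kafka connector is assumed to be on the classpath; this is an illustration, not part of the role's codebase.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

// Minimal sketch: windowed event counts over a Kafka topic with Spark Structured Streaming.
// Broker address, topic name, and window/watermark sizes are placeholders; the
// spark-sql-kafka-0-10 connector is assumed to be available on the classpath.
object StreamingCountsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("streaming-counts-sketch")
      .master("local[*]") // local run for illustration only
      .getOrCreate()
    import spark.implicits._

    // Read the topic as an unbounded stream; Kafka records expose value and timestamp columns.
    val events = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()
      .selectExpr("CAST(value AS STRING) AS value", "timestamp")

    // Tumbling 1-minute windows with a 5-minute watermark to bound late data.
    val counts = events
      .withWatermark("timestamp", "5 minutes")
      .groupBy(window($"timestamp", "1 minute"))
      .count()

    counts.writeStream
      .outputMode("update")
      .format("console")
      .start()
      .awaitTermination()
  }
}
```

Running this prints per-window counts to the console; in a production platform the sink would be a durable store rather than the console.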
What you would do
Design, develop, and run cloud-native data platform and analytics SaaS services
Own architecture and provide technical leadership to multiple teams
Hands-on coding >60% of the time
Design and build large scale real-time stream processing systems
Design, develop, and run microservices and analytics SaaS solutions
Do test-driven unit and end-to-end testing of any code you develop
Own Continuous Integration (CI) and Continuous Deployment (CD) for your services
Own scalability, availability and data security for your services
Own, troubleshoot & resolve code defects
Mentor other developers in best practices
What you would need to succeed
Prior experience with and passion for building large-scale, multi-tenant, cloud-native data platforms
Emphasize team wins over individual success
Strong technical communication skills
Excellent software development skills in one or more of the following languages: Java/Scala
Extensive experience with Big Data and distributed systems; expertise in Spark or Flink, Kafka, and the Hadoop ecosystem
System design skills; ability to design large-scale distributed systems
Have developed in more than one language and are ready to pivot to any language/framework
Understand REST APIs for data interchange and API-driven system design (see the sketch after this list)
Understand microservices architecture patterns like Service Discovery, API Gateway, Domain-Driven Design, etc.
Understand serverless functions and their relevant use cases
Ability to work in an agile, fast-paced environment
BS or MS degree (Computer Science or Math)
5 years of relevant work experience
Refer to the Required Skills section for more details
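To make the REST and API-driven design point above concrete, here is a minimal JSON health endpoint sketch in Scala using only the JDK's built-in HTTP server, so no framework is assumed. The port, path, and response body are placeholders chosen for illustration.

```scala
import com.sun.net.httpserver.{HttpExchange, HttpServer}
import java.net.InetSocketAddress
import java.nio.charset.StandardCharsets

// Minimal REST-style endpoint sketch using the JDK's built-in HTTP server.
// Port, path, and response body are placeholders, not a production design.
object HealthEndpointSketch {
  def main(args: Array[String]): Unit = {
    val server = HttpServer.create(new InetSocketAddress(8080), 0)

    // GET /health returns a small JSON document describing service status.
    server.createContext("/health", (exchange: HttpExchange) => {
      val body = """{"status":"UP"}""".getBytes(StandardCharsets.UTF_8)
      exchange.getResponseHeaders.add("Content-Type", "application/json")
      exchange.sendResponseHeaders(200, body.length.toLong)
      val out = exchange.getResponseBody
      out.write(body)
      out.close()
    })

    server.start()
    println("Health endpoint listening on http://localhost:8080/health")
  }
}
```

A client can check the endpoint with, for example, curl http://localhost:8080/health.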
Bonus
AWS (EMR, S3, Glue, Kinesis, ...)
ELK
Experience building SaaS/PaaS on AWS/GCP/Azure, etc.
AI/ML
Please find more about our Data Platform Team in the following blog: https://medium.com/guidewire-engineering-blog/introducing-guidewire-data...
About Guidewire
Guidewire is the platform P&C insurers trust to engage, innovate, and grow efficiently.
Guidewire combines core, data, digital, analytics, and AI to deliver our platform as a cloud service. More than 400 insurers, including the largest and most complex in the world, run on Guidewire.
Job ID: 83174
