Job Description
Toronto
About AI & Analytics: Artificial intelligence (AI) and the data it collects and analyzes will soon sit at the core of all intelligent, human-centric businesses. By decoding customer needs, preferences, and behaviors, our clients can understand exactly what services, products, and experiences their consumers need. Within AI & Analytics, we work to design the future—a future in which trial-and-error business decisions have been replaced by informed choices and data-supported strategies.
By applying AI and data science, we help leading companies prototype, refine, validate, and scale their AI and analytics products and delivery models. Cognizant’s AIA practice takes insights that are buried in data and gives businesses a clear way to transform how they source, interpret, and consume their information.
- Good experience in Unix and shell scripting; hands-on experience in Spark with Python and Scala
- Hands-on experience loading and manipulating large data sets using Spark
- SQL and Hive knowledge for debugging and troubleshooting Hadoop jobs
- Prepare implementation plans as needed and build the in-scope applications in Big Data technologies
- Responsible for all technical deliveries of the project
- Manage data-related requests, analyze issues, and provide efficient resolutions. Design all program specifications and perform required tests.
- Write code for all modules according to the required specifications.
- Monitor all production issues and inquiries and provide efficient resolution.
- Evaluate all functional requirements and mapping documents, and troubleshoot all development processes.
- Collaborate with application groups to prepare effective solutions for all programs.
- Document all technical specifications and associated project deliverables.
- Design all test cases to provide support to all systems and perform unit tests.
- Good understanding of Agile and DevOps methodologies
- Good communication and client-interfacing skills
- Should have worked in an offshore delivery model
- Experience designing, building, and managing data marts, including processing in Hive/Spark
- Experience in Python programming
- Experience working with the Hadoop ecosystem (HDFS, Hive, Sqoop, and Spark)
- Experience developing frameworks and ETL processes using Hive and Scala/Spark
- Excellent analytical and technical skills, with experience working in an onsite/offshore model
Job ID: 95788