Duties and Responsibilities
Experience – 5 to 10 years
· At least 4 years of strong experience in a Big Data Hadoop environment.
· Insurance domain knowledge
· Hands-on experience working with Kylo, Apache NiFi, and Hadoop distribution platforms such as Hortonworks, Cloudera, and MapR
· Should possess hands-on experience in scripting languages such as Shell, Python, or Perl.
· Ability to implement and customize data ingestion pipelines.
· Ability to customize NiFi templates
· Define categories for feeds, along with their sources and destinations
· Implement authentication and authorization using Apache Ranger and Active Directory
· Implement jobs using Apache Spark
· Implement batch and real-time data processing
· Should be able to cleanse and validate data
· Experience in scheduling and monitoring feeds
· Develop code and perform unit testing
· Should possess good communication skills and exhibit leadership skills