CANDIDATES MUST HAVE ESTABLISHED THE LEGAL RIGHT TO WORK IN THE US. NO C2C.
This person will get first-hand experience in the design, architecture, and development of an enterprise big data Hadoop platform. This fast-paced role requires an architect who possesses not only excellent analytical skills but also a creative spirit and strong communication skills. Serving as a Technical Architect working with Big Data, Hadoop, and Hortonworks, the individual will solve technology problems, meet client requirements from analysis through implementation, and enable client strategies. The individual will have the opportunity to work in challenging, dynamic environments that will continue to enhance their skill set. Key responsibilities include:
Participating in the architecture, design, and development of the enterprise big data platform (primary responsibility)
Translating business requirements and enhancements into system-level technical modifications
Performing complex analysis, assessment, design, configuration, and programming functions
Collaborating with the team and key stakeholders
Assisting with research by quickly developing models that accurately represent business use cases as requests arise
Working independently on ad hoc requests and reporting results in a timely and professional manner
Interacting with business stakeholders
At least 10 to 12 years of overall IT experience and 5 to 6 years of relevant experience
Architect the complete end-to-end design of an enterprise-wide big data solution
Should be able to differentiate among, and recommend, the tools that can be used to solve business problems
Design and develop big data solutions on the Hadoop and Spark platforms
Experience with Hortonworks distributions is a must
Amazon Web Services (AWS) experience is a must
Big data development experience on the Hadoop platform, including Hive, Hive LLAP, Sqoop, Flume, and Spark
Application development experience in Core Java/ Python
Strong Core Java experience
Should be able to cover end-to-end BI and data strategy, including partnership with internal and external stakeholders
Experience with data modeling, complex data structures, data processing, data quality, and data lifecycle management
Should be able to lead critical aspects of data management and application management
Experience with UNIX shell scripting, batch scheduling, and version control tools
Experience with analytical programming and the ability to work with EDW architecture to bridge the gap between a traditional DB architecture and a Hadoop-centric architecture
Highly organized and analytical, capable of solving business problems with technology
Should be an individual with in-depth technical knowledge and hands-on experience in the areas of Data Management, BI Architecture, Product Development, RDBMS and non-RDBMS platforms.
Should have excellent analytical skills and be able to recognize data patterns and troubleshoot data issues
Will be responsible for the design and delivery of data solutions that empower data migration initiatives, BI initiatives, dashboard development, etc.
Will design ETL processes and must be able to explore POC/prototype options
Experience building solutions for streaming applications is a plus
A thorough understanding of the implications of software design and implementation choices on performance and maintainability
Experience in large scale server-side application development that includes the design and implementation of high-volume data processing jobs.
Functional programming experience is desired.