- Key Primary Skill Set:
Below are the MUST-have requirements that must map to the candidate's working experience:
1. Experience in Big Data development
2. Strong knowledge of Big Data components, nodes, and their architecture
3. Experience in scripting with Scala or Python
4. Experience in data sourcing with Kafka, Flume, and Sqoop
5. Experience in pulling data from databases, files, and real-time streams
6. Batch experience with a scheduling tool such as Control-M, along with Unix scripting
7. Good SQL experience with Hive and HBase
8. Scala or Python scripting knowledge for loading structured/unstructured data into Hive/HBase/NoSQL databases and performing DML operations with Hive and HBase
9. Exposure to NoSQL databases such as MongoDB and Couchbase
10. Scripting knowledge to validate, load, and verify various file formats such as XML, JSON, etc.
11. Experience in implementing batch control/audit tables
12. Experience with workflow tools such as Oozie/Falcon
13. Experience in inter-cluster data copying
14. Experience in implementing ETL logic
15. Good understanding of Hadoop resource utilization, queues, and multi-tenancy
16. Good experience with data warehouse projects; experience in the banking domain is an added advantage
Malaysian/Expats/Locally Available Expats: Any
Notice Period/Expected Start Date: Immediate
Employment Type: Contract
Salary: Max of RM 10,000
Client: Bank (IT division); name will be disclosed once you show interest.
Location: KL Central, Malaysia.
Recruiter Name: Mr. Guna Sekaran
Email Address: valarmathi@saraswathyconsultancy.com