Data Engineer (Spark & Microsoft Azure Cloud), Up to $7.5k
The Opportunity
- Our client is an established US MNC in the FMCG sector
- Exciting work opportunities in one of the industry's largest firms
- Attractive remuneration package
The Talent
- Skilled in at least one programming language, Java or Python
- Strong in SQL, with the ability to build complex queries
- Strong hands-on development experience in Apache Spark/Hadoop and Hive (see the sketch after this list)
- Hands-on experience with data services on one of the public clouds:
  - AWS (Glue, Athena, S3, EMR, Kinesis, Redshift, Lambda, API Gateway, CloudFormation)
  - Microsoft Azure (preferred) (Data Factory, HDInsight, Blob Storage, Azure Functions, API Management, Resource Manager, SQL Server)
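By way of illustration only (not part of the posting's requirements), here is a minimal PySpark sketch of the kind of Spark-plus-SQL work described above. The file paths, view names, and columns are hypothetical examples:

```python
# Minimal PySpark sketch: ingest CSVs, register them as views,
# and run a multi-table SQL query (join + aggregation).
# Paths, view names, and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("skills-sketch").getOrCreate()

# Ingest raw data (hypothetical paths; could equally be S3 or Azure Blob Storage).
orders = spark.read.option("header", True).csv("/data/raw/orders.csv")
customers = spark.read.option("header", True).csv("/data/raw/customers.csv")

orders.createOrReplaceTempView("orders")
customers.createOrReplaceTempView("customers")

# A SQL query joining and aggregating across the two views.
top_customers = spark.sql("""
    SELECT c.customer_id,
           c.region,
           SUM(CAST(o.amount AS DOUBLE)) AS total_spend
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    GROUP BY c.customer_id, c.region
    ORDER BY total_spend DESC
    LIMIT 10
""")
top_customers.show()
```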
The Job
- Own the full data lifecycle: ingest data from various channels, process it across the layers of the Consumer Data Platform, and serve it to multiple systems and BI tools for analytics and reporting (a minimal sketch follows this list)
- Apply a good understanding of end-to-end data processes and flows to the design and implementation of data solutions
- Work with cross-functional colleagues in marketing and IT, and with external IT vendors, to define requirements iteratively, leveraging storyboards and prototypes
- Contribute to the development of the relevant solution alongside other development team members and vendors (in-house or third-party), operating within the company's prescribed architecture and roadmap
- Lead the development, testing, issue resolution, and launch of product releases
- Drive adoption of the solution and value realization with cross-functional colleagues
- Transition the solution into steady-state operation at the right time
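As a hedged sketch of the ingest-process-serve lifecycle described above, assuming PySpark with the hadoop-azure connector; the storage account, container, and column names are hypothetical, and a real pipeline would add schema enforcement, monitoring, and orchestration (e.g. via Azure Data Factory):

```python
# Sketch of an ingest -> process -> serve flow on a consumer data platform.
# Storage account, containers, and columns are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("cdp-pipeline-sketch").getOrCreate()

# 1. Ingest: raw events landed by upstream channels (hypothetical wasbs:// path;
#    requires the hadoop-azure connector on the classpath).
raw = spark.read.json("wasbs://raw@examplestore.blob.core.windows.net/events/")

# 2. Process: cleanse and aggregate into a curated layer.
daily = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_ts"))
       .groupBy("event_date", "channel")
       .agg(F.countDistinct("consumer_id").alias("unique_consumers"),
            F.count("*").alias("events"))
)

# 3. Serve: write a partitioned Parquet table that downstream systems
#    and BI tools can query.
(daily.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("wasbs://curated@examplestore.blob.core.windows.net/daily_channel_metrics/"))
```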
Next Step
Prepare your updated resume (include your current salary package with a full breakdown, e.g. base, incentives, annual wage supplement) and your expected package.
Email your updated resume to technicalstaffing@adecco.com. All shortlisted candidates will be contacted.