Tuesday 5 February 2019

IT Developer Vacancies at Techwave, Netherlands


Techwave's client focus and collaborative approach have resulted in it being ranked #9 among the fastest-growing companies in the region by the Philadelphia Business Journal and #15 in the IT services domain in the USA by Inc. 500. Techwave has more than 280 consultants in over 6 countries. Techwave takes IT and business best practices and personalizes them for its clients to provide mature and effective IT outcomes for the business.

SPLUNK DEVELOPER

Experience with Splunk architecture and implementation
Success creating both operational and executive dashboards using Splunk
Success writing and tuning custom queries (see the SPL sketch after this list)
Experience using Splunk Enterprise Security.
Familiarity with Splunk's new User Behavior Analytics (UBA) and Security Orchestration, Automation and Response (SOAR) offerings
Ability to translate business use-cases into operational dashboards and queries
Ability to identify gaps in the visibility provided by the collected logs and recommend additional logging
Experience performing health checks for Splunk instances to make sure licensing is not breached and logs are being collected successfully
Additional security related certifications and experiences are advantageous but not required.
Responding to customer tickets on dashboards, alerts, and reports
Understanding the customer middleware (business/support) team's requirements
Updating code for dashboards, alerts, and reports in the customer application
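
Much of the query and alert work above comes down to writing SPL and running it against a search head. The snippet below is a rough sketch only: it assumes Splunk's REST export endpoint on the standard management port, and the host, credentials, index, and sourcetype are placeholder values, not details from this vacancy.

import requests

# Placeholder search head and credentials -- replace with real values.
SPLUNK_HOST = "https://splunk.example.com:8089"
AUTH = ("admin", "changeme")

# A small ad-hoc SPL query: error counts per host over the last 24 hours,
# against a hypothetical index and sourcetype.
spl = "search index=web_app sourcetype=app_logs log_level=ERROR earliest=-24h | stats count by host"

# The export endpoint streams results back as newline-delimited JSON.
response = requests.post(
    f"{SPLUNK_HOST}/services/search/jobs/export",
    auth=AUTH,
    data={"search": spl, "output_mode": "json"},
    verify=False,  # only acceptable against lab instances with self-signed certificates
)

for line in response.iter_lines():
    if line:
        print(line.decode("utf-8"))

Once a query like this is tuned, it can be saved as a report or alert and surfaced on an operational dashboard.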

Required Skill Set:

3-7 years of experience
Good communication skills, oral & written
Willingness to travel, short & long term
Splunk SPL (able to understand and use its commands and syntax)
Ability to manage the user interface using XML; HTML coding knowledge is required
Good knowledge of Splunk knowledge objects (alerts, reports, event types, etc.; see the sketch after this list)
Good analytical skills
Dashboard source code version management skills
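
Alerts and reports are stored in Splunk as saved searches, which makes them straightforward to inventory and keep under version control. The sketch below lists them through the saved/searches REST endpoint; as above, the host and credentials are placeholders and the snippet is illustrative rather than part of the role.

import requests

SPLUNK_HOST = "https://splunk.example.com:8089"  # placeholder search head
AUTH = ("admin", "changeme")                     # placeholder credentials

# Saved searches back both scheduled reports and alerts in Splunk.
resp = requests.get(
    f"{SPLUNK_HOST}/services/saved/searches",
    auth=AUTH,
    params={"output_mode": "json", "count": 0},  # count=0 returns all entries
    verify=False,  # lab/self-signed certificates only
)
resp.raise_for_status()

# Print each knowledge object's name and the SPL behind it.
for entry in resp.json()["entry"]:
    print(entry["name"])
    print("    " + entry["content"]["search"])

Dumping the returned SPL into files is one simple way to put dashboard and alert sources under version control, as the last skill above asks for.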


HADOOP DEVELOPER

Experience: 6-12 years

Responsibilities:

Hadoop development and implementation
Loading from disparate data sets
Pre-processing using Hive and Pig (see the Hive sketch after this list)
Designing, building, installing, configuring and supporting Hadoop
Translating complex functional and technical requirements into detailed designs
Performing analysis of vast data stores to uncover insights
Maintaining security and data privacy
Creating scalable, high-performance web services for data tracking
High-speed querying
Managing and deploying HBase
Being a part of a POC effort to help build new Hadoop clusters
Testing prototypes and overseeing handover to operational teams
Proposing best practices and standards
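
The pre-processing item above usually means running HiveQL over raw tables before downstream jobs consume them. The following is a minimal sketch under stated assumptions: it presumes a reachable HiveServer2 endpoint and the third-party PyHive client, and the host, user, table, and column names are invented for illustration.

from pyhive import hive  # third-party HiveServer2 client (pip install pyhive)

# Placeholder HiveServer2 connection details.
conn = hive.connect(host="hadoop-edge.example.com", port=10000, username="etl_user")
cursor = conn.cursor()

# Hypothetical pre-processing step: keep only well-formed click events
# from a raw staging table and write them into a cleaned table.
cursor.execute("""
    INSERT OVERWRITE TABLE clickstream_clean
    SELECT user_id, page_url, event_ts
    FROM clickstream_raw
    WHERE user_id IS NOT NULL
      AND event_ts IS NOT NULL
""")

# Quick sanity check on the cleaned table.
cursor.execute("SELECT COUNT(*) FROM clickstream_clean")
print("cleaned rows:", cursor.fetchone()[0])

cursor.close()
conn.close()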

Required Skill Set:

Knowledge of Hadoop
Good knowledge of back-end programming, specifically Java
Writing high-performance, reliable and maintainable code
Ability to write MapReduce jobs (see the streaming sketch after this list)
Good knowledge of database structures, theories, principles, and practices
Ability to write Pig Latin scripts
Hands on experience in HiveQL
Familiarity with data loading tools such as Flume and Sqoop
Knowledge of workflow schedulers like Oozie
Analytical and problem-solving skills applied to Big Data domain
Proven understanding of Hadoop, Hive, Pig, and HBase
Good aptitude for multi-threading and concurrency concepts
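
Production MapReduce jobs at this level are usually written in Java, as the skill set notes, but the programming model itself is easy to sketch in Python via Hadoop Streaming, where the mapper and reducer simply read stdin and write stdout. The word count below is illustrative only; the script name and any submission command are assumptions, not requirements from this vacancy.

#!/usr/bin/env python3
"""Word count for Hadoop Streaming: run with 'map' or 'reduce' as the only argument."""
import sys


def mapper():
    # Emit "word<TAB>1" for every token read from stdin.
    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")


def reducer():
    # Streaming sorts mapper output by key, so counts for a word arrive contiguously.
    current_word, current_count = None, 0
    for line in sys.stdin:
        word, count = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            current_count += int(count)
        else:
            if current_word is not None:
                print(f"{current_word}\t{current_count}")
            current_word, current_count = word, int(count)
    if current_word is not None:
        print(f"{current_word}\t{current_count}")


if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()

A hypothetical submission would pass the same script as both mapper and reducer to the hadoop-streaming jar (for example, -mapper "wordcount.py map" -reducer "wordcount.py reduce"), with HDFS input and output paths of your choosing.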


Sri Ganesh Gunnam
9182121295
svinay@techwave.net, smastan@techwave.net
http://techwave.net/
