Sorry, this offer is no longer available,
but you can run a new search or explore similar offers:

Senior Full Stack Engineer, Marketplace Zone

CookUnity is a chef-to-you marketplace connecting talented chefs with consumers, revolutionizing meal delivery. We bring small-batch, restaurant-quality meal...


From Borderless Capital - Capital Federal

Posted a month ago

Web Developer

RYZ is looking for a skilled Web Developer to join our team. The ideal candidate will have a strong understanding of web design principles and the ability to...


From Ryzlabs - Capital Federal

Posted a month ago

Scrum Master IRC216418

Work Model: Hybrid. Description: Coordinate the client project, own Jira management, and lead all meetings, reports, and metrics. Requirem...


From Globallogic - Capital Federal

Posted a month ago

Software Engineer III - Java

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level.  As a Software Engineer III at JPMorgan Cha...


From JPMorgan Chase & Co. - Capital Federal

Posted a month ago

Big Data Consultant

Company:

Giannazzo & Asoc.


Offer details

We are looking for an ERP consultant to work on-site in Puerto Madero, from 9 a.m. to 6 p.m. The position offers salaried employment, health insurance, and other benefits.

Hadoop Developer Roles and Responsibilities:

The following are the tasks a Hadoop Developer is responsible for:

Hadoop development and implementation (a minimal MapReduce sketch in Scala follows this list).

Loading from disparate data sets.

Pre-processing using Hadoop ecosystem tools.

Translate complex functional and technical requirements into detailed designs.

Design, develop, document, and architect Hadoop applications.

Perform analysis of vast data stores and uncover insights.

Maintain security and data privacy.

Create scalable and high-performance web services for data tracking.

High-speed querying.

Propose best practices/standards.
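
As referenced above, the sketch below shows what "Hadoop development and implementation" typically looks like in its simplest form: a word-count MapReduce job written in Scala against the Hadoop Java API. This is only an illustration and not part of the listing; the class names, input/output paths, and the assumption of Scala 2.13 (for scala.jdk.CollectionConverters) are placeholders.

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.Path
import org.apache.hadoop.io.{IntWritable, LongWritable, Text}
import org.apache.hadoop.mapreduce.{Job, Mapper, Reducer}
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat
import scala.jdk.CollectionConverters._

// Mapper: emit (token, 1) for every word in an input line.
class TokenMapper extends Mapper[LongWritable, Text, Text, IntWritable] {
  private val one  = new IntWritable(1)
  private val word = new Text()
  override def map(key: LongWritable, value: Text,
                   ctx: Mapper[LongWritable, Text, Text, IntWritable]#Context): Unit =
    value.toString.toLowerCase.split("\\W+").filter(_.nonEmpty).foreach { t =>
      word.set(t)
      ctx.write(word, one)
    }
}

// Reducer: sum the counts emitted for each token.
class SumReducer extends Reducer[Text, IntWritable, Text, IntWritable] {
  override def reduce(key: Text, values: java.lang.Iterable[IntWritable],
                      ctx: Reducer[Text, IntWritable, Text, IntWritable]#Context): Unit =
    ctx.write(key, new IntWritable(values.asScala.map(_.get).sum))
}

object WordCount {
  def main(args: Array[String]): Unit = {
    val job = Job.getInstance(new Configuration(), "word-count")
    job.setJarByClass(classOf[TokenMapper])
    job.setMapperClass(classOf[TokenMapper])
    job.setCombinerClass(classOf[SumReducer])   // combiner cuts shuffle volume
    job.setReducerClass(classOf[SumReducer])
    job.setOutputKeyClass(classOf[Text])
    job.setOutputValueClass(classOf[IntWritable])
    FileInputFormat.addInputPath(job, new Path(args(0)))    // HDFS input directory
    FileOutputFormat.setOutputPath(job, new Path(args(1)))  // must not already exist
    System.exit(if (job.waitForCompletion(true)) 0 else 1)
  }
}

A job like this would normally be packaged as a jar (sbt or Maven) and launched with hadoop jar, passing the input and output HDFS paths as arguments.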

Hadoop Developer Work Routine and Skills:

Loading data from different datasets and deciding which file format is most efficient for a given task

Ability to work with huge volumes of data in order to derive business intelligence

Apply different HDFS file formats and structures, such as Parquet and Avro, to speed up analytics

Analyze data, uncover information, derive insights and propose data-driven strategies

Knowledge of OOP languages such as Java, C++, or Python is good to have

Writing high-performance, reliable and maintainable code

Familiarity with data loading tools like Flume, Sqoop

Knowledge of workflow/schedulers like Oozie

Analytical and problem-solving skills applied to the Big Data domain

Proven understanding of Hadoop, HBase, Hive, and Pig

Good aptitude in multi-threading and concurrency concepts

Knowledge of agile methodology for delivering software solutions

Work with Hadoop log files for cluster management and monitoring

Develop MapReduce code that runs seamlessly on Hadoop clusters

Working knowledge of SQL, NoSQL, data warehousing, and database administration

Expertise in newer technologies such as Apache Spark and Scala programming (see the sketch after this list)

Complete knowledge of the Hadoop ecosystem

Convert complex technical requirements into clear, well-structured designs
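
Several of the items above (columnar formats such as Parquet, Spark and Scala programming, and SQL-style analytics) typically come together in a data-curation job, sketched below as referenced in the list. It is a minimal illustration under assumed conditions: the HDFS paths and the column names user_id, event_type, and event_ts are hypothetical placeholders, and a Spark 3.x DataFrame API is assumed; none of these details come from the listing.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EventsToParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("events-to-parquet").getOrCreate()

    // Load raw CSV event logs from a (hypothetical) HDFS landing directory.
    val raw = spark.read
      .option("header", "true")
      .csv("hdfs:///data/raw/events")

    // Light pre-processing: drop rows missing key fields and derive a date column.
    val cleaned = raw
      .na.drop(Seq("user_id", "event_type"))
      .withColumn("event_date", to_date(col("event_ts")))

    // Write columnar Parquet, partitioned by date, to speed up later analytics.
    cleaned.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("hdfs:///data/curated/events")

    // Query the curated data back: daily event counts per event type.
    spark.read.parquet("hdfs:///data/curated/events")
      .groupBy("event_date", "event_type")
      .count()
      .orderBy("event_date")
      .show(20, truncate = false)

    spark.stop()
  }
}

Submitted with spark-submit, a job of this shape covers much of what the list describes: loading raw data, pre-processing it, storing it in a fast columnar format on HDFS, and running analytical queries over it.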


Source: Jobs4It

Requirements


Knowledge:
Big Data Consultant

