
Big Data Consultant

Company:

Giannazzo & Asoc.


Offer details

An ERP consultant is sought to work on-site in Puerto Madero, from 9 a.m. to 6 p.m. The position offers formal employment, health coverage, and other benefits.

Hadoop Developer Roles and Responsibilities:

The following are the tasks a Hadoop Developer is responsible for:

Hadoop development and implementation.

Loading data from disparate data sets.

Pre-processing using Hadoop ecosystem tools.

Translating complex functional and technical requirements into detailed designs.

Designing, developing, documenting, and architecting Hadoop applications.

Performing analysis of vast data stores to uncover insights.

Maintaining security and data privacy.

Creating scalable, high-performance web services for data tracking.

High-speed querying.

Proposing best practices and standards.
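
The MapReduce pattern behind many of these tasks can be sketched in plain Python. This is a toy word count illustrating the map, shuffle, and reduce phases conceptually; it is a hypothetical example, not actual Hadoop API code.

```python
from collections import defaultdict

def map_phase(line):
    # Map: emit a (word, 1) pair for every word in an input line.
    return [(word, 1) for word in line.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the Hadoop
    # framework does between the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each group; here, sum the counts per word.
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data tools", "big data pipelines"]
pairs = [pair for line in lines for pair in map_phase(line)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # "big" appears once in each of the two lines
```

In a real Hadoop job the map and reduce functions run distributed across the cluster and the framework performs the shuffle, but the data flow is the same.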

Hadoop Developer Work Routine and Skills:

Loading data from different data sets and deciding which file format is most efficient for a task

Ability to work with huge volumes of data in order to derive business intelligence

Applying HDFS file formats and structures such as Parquet and Avro to speed up analytics

Analyzing data, uncovering information, deriving insights, and proposing data-driven strategies

Knowledge of OOP languages such as Java, C++, and Python is a plus

Writing high-performance, reliable, and maintainable code

Familiarity with data-loading tools such as Flume and Sqoop

Knowledge of workflow schedulers such as Oozie

Analytical and problem-solving skills applied to the Big Data domain

Proven understanding of Hadoop, HBase, Hive, and Pig

Good grasp of multi-threading and concurrency concepts

Knowledge of agile methodology for delivering software solutions

Working with Hadoop log files to manage and monitor the cluster

Developing MapReduce code that runs seamlessly on Hadoop clusters

Working knowledge of SQL, NoSQL, data warehousing, and database administration

Expertise in newer technologies such as Apache Spark and Scala programming

Thorough knowledge of the Hadoop ecosystem

Converting hard-to-grasp technical requirements into outstanding designs
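
Why columnar formats such as Parquet speed up analytics can be illustrated with a minimal Python sketch (hypothetical data, no Parquet library involved): a column-oriented layout lets an aggregate query scan only the one column it needs instead of every full record.

```python
# Row-oriented layout: each record stores all of its fields together,
# so even a single-column query touches every field of every record.
rows = [
    {"user": "ana",  "country": "AR", "clicks": 12},
    {"user": "ben",  "country": "UY", "clicks": 7},
    {"user": "cris", "country": "AR", "clicks": 31},
]

# Column-oriented layout (the idea behind Parquet, ORC, etc.):
# one contiguous array per column.
columns = {
    "user":    ["ana", "ben", "cris"],
    "country": ["AR", "UY", "AR"],
    "clicks":  [12, 7, 31],
}

# Query: total clicks. The row layout iterates whole records...
total_row_layout = sum(r["clicks"] for r in rows)

# ...while the columnar layout scans a single array, which on disk is
# also far easier to compress and to skip when a query doesn't need it.
total_columnar = sum(columns["clicks"])

print(total_row_layout, total_columnar)  # both 50
```

Real columnar files add encoding, compression, and per-chunk statistics on top of this layout, but the column-at-a-time access pattern is the core of the speedup.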


Source: Jobs4It

Requirements

Skills:
Big Data Consultant
