Sorry, this offer is no longer available,
but you can run a new search or browse similar offers:

Senior DevOps Analyst

At Ketos Delphin we are looking for a «Senior DevOps Analyst» for a company specialized in stock brokerage, portfolio management, funds...


From Ketos Delphin S.A - Capital Federal

Posted a month ago

Scraping Developer

Remote position (only for professionals based in Latam) RYZ is looking for Engineers who have experience doing scraping at scale. Basic Qualifications: - 3+ ...


From Ryz Labs - Capital Federal

Posted a month ago

DevOps Ssr

We are looking for a DevOps engineer for our client: a major multinational tobacco company. Hybrid 3×2 arrangement (Olivo...


From Ketos Delphin S.A - Capital Federal

Posted a month ago

Advanced Maintenance & Support Engineer - Buenos Aires, Argentina - Posted On 08/17/2023

Job summary: Provides technical assistance for DN software applications. Offers enterprise-level support to customers through the an...


From Diebold Inc. - Capital Federal

Posted a month ago

Big Data Consultant

Big Data Consultant
Company:

Giannazzo & Asoc.


Offer details

An ERP consultant is sought to work on-site in Puerto Madero, from 9 a.m. to 6 p.m. The role offers formal employment, health coverage, and other benefits.

Hadoop Developer Roles and Responsibilities:

The following are the tasks a Hadoop Developer is responsible for:

Hadoop development and implementation.

Loading from disparate data sets.

Pre-processing using Hadoop ecosystem tools.

Translate complex functional and technical requirements into detailed design

Design, develop, document and architect Hadoop applications

Perform analysis of vast data stores and uncover insights.

Maintain security and data privacy.

Create scalable and high-performance web services for data tracking.

High-speed querying.

Propose best practices/standards.
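The loading, pre-processing, and analysis duties above follow the classic MapReduce pattern. As a minimal sketch (pure Python, no Hadoop dependency), the map, shuffle, and reduce stages can be illustrated with a word-count task:

```python
from collections import defaultdict
from itertools import chain

def map_phase(line):
    # Emit (word, 1) pairs, mirroring a Hadoop Mapper.
    return [(word.lower(), 1) for word in line.split()]

def shuffle(pairs):
    # Group values by key, as the framework does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts per key, mirroring a Hadoop Reducer.
    return {key: sum(values) for key, values in grouped.items()}

lines = ["big data big insights", "data driven design"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(l) for l in lines)))
print(counts)  # {'big': 2, 'data': 2, 'insights': 1, 'driven': 1, 'design': 1}
```

On a real cluster the same mapper and reducer logic would be distributed across nodes by the Hadoop framework; the toy input lines here stand in for HDFS splits.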

Hadoop Developer Work Routine and Skills:

Loading data from different datasets and deciding which file format is efficient for a task

Ability to work with huge volumes of data to derive business intelligence

Apply different HDFS formats and structure like Parquet, Avro, etc. to speed up analytics

Analyze data, uncover information, derive insights and propose data-driven strategies

Knowledge of OOP languages such as Java, C++, or Python is good to have

Writing high-performance, reliable and maintainable code

Familiarity with data loading tools like Flume, Sqoop

Knowledge of workflow/schedulers like Oozie

Analytical and problem solving skills, applied to Big Data domain

Proven understanding of Hadoop, HBase, Hive, and Pig

Good aptitude in multi-threading and concurrency concepts

Knowledge of agile methodology for delivering software solutions

Work with Hadoop log files to manage and monitor the cluster

Develop MapReduce code that runs seamlessly on Hadoop clusters

Working knowledge of SQL, NoSQL, data warehousing & DBA

Expertise in newer concepts like Apache Spark and Scala programming

Complete knowledge of the Hadoop ecosystem

Seamlessly convert hard-to-grasp technical requirements into outstanding designs
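The multi-threading and concurrency skill above can be sketched with Python's standard `concurrent.futures`: partitions of records are processed in parallel and the partial results combined, much like independent map tasks feeding a reducer (a toy illustration, not Hadoop itself):

```python
from concurrent.futures import ThreadPoolExecutor

def process_partition(partition):
    # Toy "task": sum one partition of records,
    # as a map task would process one input split.
    return sum(partition)

partitions = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]

with ThreadPoolExecutor(max_workers=3) as pool:
    # map() preserves input order, so partial results line up with partitions.
    partial_sums = list(pool.map(process_partition, partitions))

total = sum(partial_sums)  # combine partial results, as a reducer would
print(total)  # 45
```

The same divide-and-combine shape underlies MapReduce jobs and Spark RDD operations; only the scheduling and fault tolerance differ.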


Source: Jobs4It

Requirements


Skills:
Big Data Consultant


Built at: 2024-04-29T11:35:42.966Z