Please send profiles to deepak.gulia@simplion.com
Position: ETL Developer
Location: Madison, WI
Duration: Long Term
Job Description
- Design ETL solutions to process/load source-system data into multiple databases
- 3+ years hands on experience with dimensional and operational data modeling
- 5+ years hands on lead experience architecting and designing ETL components
- Strong understanding of the tradeoffs of various design options across multiple platforms
- Experience with preparing technical artifacts to support design efforts
- Ensure adherence to design/development standards
- Ability to lead a technical team, including onshore/offshore vendor partners
- Strong communication skills and a proactive approach to daily activities
- Positive, problem-solving attitude
- Experience with Agile technical data delivery projects
- Ability to build collaborative relationships across the organization
- Ability to plan, orchestrate, and coordinate technical release activities
- Very strong analytical and complex-problem-solving skills
- Ability to communicate with non-technical stakeholders in an easy-to-understand manner
Additional Job Requirements
- Experience with technologies in the Hadoop ecosystem, such as Hadoop, HDFS, Spark, MapReduce, Pig, Hive, Flume, Sqoop, HDInsight, ZooKeeper, Oozie, Hue, and Kafka
- Experience with high-availability configurations, Hadoop cluster connectivity and tuning, and Hadoop security configurations
- Good understanding of operating systems (Unix/Linux) and networks, plus system administration experience
- Experience with cloud data technologies (Azure, AWS or Google)
- Experience with Spark Streaming, Spark SQL, PySpark, and SparkR is important
- Good understanding of Spark's RDD API
- Good understanding of Spark's DataFrame API
- Experience with and a good understanding of the Apache Spark Data Sources API
- Knowledge of PowerShell scripting is needed to instantiate a cluster
- Experience implementing a Hive data warehouse is important
- Experience with a message-queue tool such as ActiveMQ, RabbitMQ, or Kafka is nice to have
- Experience with storage formats such as ORC, Parquet, or Avro
- Experience with Microsoft Azure technologies such as Data Factory, PolyBase, and SQL Data Warehouse is desired
- Experience implementing an MPP platform such as Teradata or Netezza is desired
- Experience configuring Spark clusters (schedulers, queues) to allocate memory efficiently
- Experience working with a data science team to implement predictive models is nice to have
Email is the best way to reach me.
Deepak Gulia | Simplion – cloud. made simple
Direct: 000000000 | Fax: 408-935-8696 | Email: deepak.gulia@simplion.com
GTalk: deepakgulia.rgtalent@gmail.com
https://in.linkedin.com/in/deepak-gulia-308a2b9b
INC 500 | 5000 Honoree – 2013, 2012, 2011, 2010, 2009
Fast Private Companies award by Silicon Valley Business Journal – 2012, 2011, 2010, 2009, 2008
Best Places to Work in Bay Area by San Francisco Business Times – 2013, 2012, 2011, 2009
Minority Business Enterprise certified by NMSDC
We are an E-Verified Company
You received this message because you are subscribed to the Google Groups "US Jobs: Requirements, Clients and Consultants" group.