

 
Company Info
Intuit
Plano, TX, United States


Job ID:

10539

Location:

Bengaluru, IN-KA, India

Category:

Big Data Analytics



Employment Type:

Full-time

Posted:

12.07.2017

Job Description:

Come join Intuit as a Lead Data Architect for our ProTax Group (PTG) analytics group. We are looking for a strategic and creative problem solver with deep experience across big data technologies to assist in the design and development of PTG's analytical and reporting data infrastructure. Our vision is two-fold: first, to organize, manage, and operationalize all of PTG's data sets (product, marketing, web, vendor, sales, customer care, and external benchmarking data) in an intuitive way; second, to simplify data access so it can support predictive analytics, behavior segmentation, and operational reporting, to name a few of our upcoming objectives for the analyst community. Can you provide data as a service to unleash our analysts' horsepower and drive our business forward? If so, this is the perfect opportunity to join a talented group of engineers in a start-up-like environment, working with cutting-edge technologies on a greenfield project.

Intuit's PTG data landscape is ripe for disruption as we move toward predictive and behavior analytics to better understand our own customers. Our infrastructure is also moving to IAC (Intuit Analytics Cloud), our Hadoop platform for data storage and analytics. We want to minimize the time all of PTG's data consumers spend on data collection and preparation. PTG is now focused on building a holistic, universal view of the customer, including all their touch points with PTG processes and products. We also aim to minimize the cost of customer acquisition, retention, and servicing, and to delight customers by eliminating pain points across the board. You will have complete autonomy and ownership to develop, enhance, and maintain our warehouse, ETL, and reporting infrastructure. Understanding business needs and translating them into data requirements will be critical to this role. This position will lead PTG's initiative to build a long-term and short-term data architecture that solves for both our operational reporting and analytics needs. This is a very hands-on position, and you must be willing to dig in and get your hands dirty.

Job Requirements:

  • 5-10 years of relevant experience

  • A solid background in the fundamentals of computer science and large-scale data processing, as well as mastery of database design and data warehousing

  • Required: experience with Hadoop, Oracle, Business Objects, Netezza, SAS, Tableau, Informatica, and Salesforce

  • Knowledge of common ETL packages/libraries, common analytical/statistical libraries and common graphical display packages

  • Demonstrated leadership ability, with strong communication, collaboration, influencing and writing skills

  • Understanding and delivering on customers’ highest-priority needs, using rigorous processes for collecting, analyzing, and prioritizing customer pain points

  • Demonstrated ability to think end-to-end, manage long-term projects and manage multiple tasks simultaneously, and deliver on outcomes / achieve results

  • Team player, with the ability to collaborate and drive results

  • Bias for action: a high-energy, “can do” style that thrives in a fast-paced environment

  • Expert knowledge of and hands-on experience with the Apache Hadoop stack, including HDFS, Oozie, Pig, and Hive

  • Leadership skills with hands-on data architecture and development experience

  • Knowledge of ETL, data profiling, data archiving, data synchronization, data migration, data integration, and data storage design patterns

  • Experience documenting data models using UML and/or ER diagrams

  • Solid communication and presentation skills

  • Source-system analysis and validation of pre-Hadoop processing to ensure data is processed without errors

  • Shell scripting with Hive and the Netezza nzload utility to load data from HDFS into Netezza database tables (a minimal sketch of this load path follows this list)

  • Deploying files to production and monitoring daily production jobs
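
The last three requirements describe a concrete operational flow: extract data with Hive, validate it, and bulk-load it into Netezza with nzload. The shell sketch below illustrates that flow under stated assumptions; every host, database, table, and column name (nz-prod, PTG_DW, ptg.customer_touchpoints, CUSTOMER_TOUCHPOINTS) is hypothetical rather than a detail from this posting, and NZ_USER/NZ_PASS are assumed to be set in the environment.

  #!/usr/bin/env bash
  # Minimal sketch of the Hive -> Netezza load path described above.
  # All host, database, table, and column names are illustrative assumptions.
  set -euo pipefail

  EXPORT_DIR=/tmp/ptg_export   # local staging directory (assumption)

  # 1. Extract from Hive into a pipe-delimited local directory.
  #    ptg.customer_touchpoints is a hypothetical source table.
  hive -e "
    INSERT OVERWRITE LOCAL DIRECTORY '${EXPORT_DIR}'
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
    SELECT customer_id, event_ts, channel FROM ptg.customer_touchpoints;
  "

  # 2. Basic validation: fail fast if the extract produced no rows.
  rows=$(cat "${EXPORT_DIR}"/* | wc -l)
  [ "${rows}" -gt 0 ] || { echo "empty extract, aborting" >&2; exit 1; }

  # 3. Bulk-load each extracted file into Netezza with nzload.
  for f in "${EXPORT_DIR}"/*; do
    nzload -host nz-prod -db PTG_DW -u "${NZ_USER}" -pw "${NZ_PASS}" \
           -t CUSTOMER_TOUCHPOINTS -df "$f" -delim '|' -maxErrors 10
  done

In practice a flow like this would typically be wrapped in an Oozie workflow (also listed above) so that the extract, validation, and load steps are scheduled and monitored as a single daily job.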



