
Location: Hyderabad, IN-TS, India


Skills: Big Data Analytics, Big Data Appliance Skills, Big Data Architect, Hadoop

Employment Type: Full time



Job Description:

Cloudwick is the leader in Big Data People, Process and Technology Transformation. Our Big Data administrators, developers, and engineers work on today's most exciting Fortune 1000 Big Data transformations at companies such as Visa, Bank of America, JP Morgan, American Express, and Network Appliance. Thanks to Cloudwick's strong technology partnerships with Cloudera, Hortonworks, DataStax, and other Big Data technology providers, we're growing at more than 20% quarter over quarter. We're looking for Computer Science graduates who want to become Big Data Hadoop administrators, developers, and engineers.

Job Requirements:

  • Cloudwick maintains its own Big Data lab where you will be trained and certified on real-world use cases using the core Big Data platforms: Cloudera, Hortonworks, MapR, and DataStax.
  • Core Hadoop management and monitoring tools that Cloudwick will train you on include Cloudera Manager, Ambari, Ganglia, and Nagios.
  • Core BI & visualization platforms you will work on include Datameer, MicroStrategy, Pentaho, Talend, and Tableau.
  • Core NoSQL technologies that Cloudwick will train and certify you on include HBase and DataStax Cassandra.
  • Other core technologies that Cloudwick will train and certify you on include MapReduce, YARN, ETL, Storm, Kafka, Puppet, Chef, Amazon, Rackspace, and OpenStack.
  • You will be trained to be a Big Data Expert!
  • You will work on interesting Big Data projects at Fortune 1000 companies.
  • Cloudwick will invest in your future and its success!
  • Responsible for implementation and ongoing administration of Hadoop infrastructure at Cloudwick's Fortune 1000 clients.
  • Cluster maintenance, including the creation and removal of nodes.
  • HDFS support and maintenance.
  • Backups and restores.
  • Cluster monitoring and troubleshooting.
  • Managing and reviewing Hadoop log files.
  • File system management and monitoring.
  • Designing, implementing, and maintaining security.
  • Data capacity and node forecasting and planning.
  • Working closely with the infrastructure, network, database, application, and business intelligence teams to ensure data quality and availability.
  • Working with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
