Apache Spark Application Developer
Role details

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Management Level: 10
Work Experience: 4-6 years

Job location

Work Location: Kolkata

Tech stack

Must Have Skills:
Good To Have Skills:

Requirements

Key Responsibilities:
A: Will be responsible for various transformations and actions, as well as Spark configuration and tuning techniques
B: Must be able to work with the Hadoop architecture: execution engines, frameworks, applications and tools
C: Should be able to work in PySpark using the Spark MLlib library
D: Must be able to apply data warehousing concepts and methods

Technical Experience:
A: Should have 4-5 years of experience using PySpark with Spark RDDs, Spark SQL and DataFrames
B: Should have 2-3 years of experience with AWS SageMaker and AWS Glue
C: Should have 3-4 years of experience in data wrangling and data analysis with Pandas and NumPy
D: Should have 3-4 years of working experience with ML algorithms such as Random Forest, Linear Regression, Logistic Regression, Decision Trees, K-Means, etc.

Professional Attributes:
A: Should have good communication and analytical skills

Educational Qualification:
A: Mandatory 15 years of full-time education (graduation)

Additional Information:
A: No specific shift timings
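For illustration only, below is a minimal sketch of the kind of work described under Key Responsibilities A and C: a PySpark DataFrame transformation and action, followed by a small Spark MLlib pipeline. The input file, application name and column names (age, income, churned) are hypothetical placeholders, not part of this posting.

# Illustrative sketch only: DataFrame transformation/action plus a Spark MLlib pipeline.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("example-job").getOrCreate()

# Transformation (lazy): read the data and filter rows
df = spark.read.csv("customers.csv", header=True, inferSchema=True)
adults = df.filter(df.age >= 18)

# Action (eager): triggers execution and returns the row count
print("rows after filtering:", adults.count())

# Spark MLlib: assemble feature vectors and fit a logistic regression model
assembler = VectorAssembler(inputCols=["age", "income"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="churned")
model = Pipeline(stages=[assembler, lr]).fit(adults)

spark.stop()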