🚀 We're Hiring! 🚀 ApTask is seeking an experienced AWS Python Data Engineer with Lambda API experience for a hybrid position in Reston, VA. If you have 10+ years of experience and strong skills in Python, Pandas, NumPy, and PySpark, we want to hear from you!

Job Description:
- Extensive Python development experience
- Expertise in AWS services (S3, RDS, EC2, Lambda, SQS, SNS, Redshift)
- Prior experience at Fannie Mae is a plus
- Proficient in Java and databases (Oracle, Postgres)
- API and Lambda experience

📧 Send your resume to abhishekk@aptask.com

#Hiring #DataEngineer #AWS #PythonDeveloper #Lambda #API #Pandas #NumPy #PySpark #Java #Database #RestonVA #TechJobs #CareerOpportunity #FannieMae #TechHiring #JobOpening #HybridJobs
ApTask’s Post
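For readers wondering what the ad's "API and Lambda experience" line looks like in practice, here is a minimal sketch of an AWS Lambda handler behind an API Gateway proxy integration. The route, query parameter, and greeting are invented for illustration; a real handler for this role would likely also reach S3 or RDS via boto3.

```python
import json

def handler(event, context):
    """Handle an API Gateway proxy event and return a JSON response.

    `event` follows the API Gateway proxy format: query parameters arrive
    under "queryStringParameters", and the response must carry a
    statusCode and a string body.
    """
    name = (event.get("queryStringParameters") or {}).get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local invocation with a fake API Gateway event (context is unused here).
resp = handler({"queryStringParameters": {"name": "Reston"}}, None)
```

Locally, nothing AWS-specific is required: the handler is just a function taking an event dict, which is also what makes it easy to unit test.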
-
Senior Technical Recruiter, hiring for AI/ML Engineer, AI Business Analyst, AI Risk Analyst, AI System Analyst
#hiringalert #W2_and_C2C_Acceptable #opentowork #activelylooking

Business Systems Analyst — Mountain View, CA (remote OK; must work PST hours)

Required skills:
- 5+ years of work experience involving quantitative data analysis
- Advanced SQL skills to pull the data you need from a data warehouse and perform data segmentation and aggregation from scratch
- Data query and data processing tools/systems (e.g., relational, NoSQL, stream processing)
- Familiarity with AWS (Redshift, Athena, and AWS core concepts)
- Familiarity with data modeling and schema design
- Proficiency in analytical and data modeling tools, such as Python, R, or PyCharm

Please share your resume at sachin@amaglobaltech.com
-
Certified - SQL Dev ® - OCI® Server Integration || Azure DevOps || Python || MySQL || MongoDB || RESTful APIs || AWS || Azure || OCI
Hi folks, we are #hiring for multiple positions at #NTTDATA. If anyone is interested, please send your resume to this email address: swamy.j@tekwissen.in

Sr Data Engineer — 5+ years of experience
Key skills: Scala and Spark
• Good understanding of Scala and Spark
• Strong demonstrable experience in system solutions design (coming from a development background) and hands-on with Java/J2EE or Python
• Strong experience with configuration tools like OpenShift, Kubernetes, Docker; coding and troubleshooting experience
• Strong experience with load balancer setup, mutual auth setup, Nginx configuration, and SSL; experience with CI tools (Jenkins, TeamCity) and build tools (Maven, Gradle, SBT)

#OpenToWork #JobSeeker #Hiring #JobSearch #LookingForWork #NewJob #JobHunt #CareerOpportunity #Employment #Resume #JobSeeking #Opportunity #ReadyToWork #CareerSearch #NowHiring #JobMarket #AvailableForWork #JobWanted #JobOpening #WorkSearch #DataEngineer #DataEngineering #BigData #ETL #DataWarehousing #DataProcessing #DataIntegration #SQL #NoSQL #DataPipeline #DataAnalytics #DataScience #CloudComputing #Python #Hadoop #Spark #DataTransformation #Database #DataManagement #ETLJobs #DataJobs #TechJobs #Scala #FunctionalProgramming #Programming #DistributedComputing #ScalaLanguage #ScalaDevelopment #TypeSafe #ScalaCommunity #Akka #PlayFramework #ScalaEngineer #ScalaJobs #ScalaProgramming #ConcurrentProgramming #FunctionalLanguages #ScalaDeveloper #ScalaCoding #ScalaSkills #JVM #ReactiveProgramming #ScalaEcosystem #ApacheSpark #SparkFramework #SparkCluster #SparkJobs #SparkProgramming #SparkDevelopers #MachineLearning #DataLake #SparkSQL #SparkML #BigDataAnalytics
-
Modern Data Engineering Roadmap - 2024

Are you passionate about pursuing a data engineering career? Here is a well-detailed roadmap for becoming a data engineer in 2024.

Stage 1: Master data engineering fundamentals. Start by developing an in-depth understanding of what data engineering entails and establish a robust programming foundation. Learn SQL and any of the following programming languages: Python, Scala, C++, or Java.

Stage 2: Build hands-on experience in cloud computing. Gain practical experience with leading cloud platforms such as AWS, Azure, or GCP. Learn to provision resources, manage storage, and deploy applications in a cloud environment.

Stage 3: Explore distributed computing frameworks. Gain knowledge of Apache Hadoop, Apache Kafka, Apache Flink, and Apache Spark. Understand their architecture and how they enable the processing of large datasets across clusters.

Stage 4: Learn data warehouses and stream processing. Develop your skills in batch and streaming data processing. Start using tools like Apache Hive or Amazon Redshift for efficient data analysis.

Stage 5: Dive into NoSQL databases and workflow orchestration. Explore NoSQL databases like MongoDB or Cassandra and learn best practices for testing and ensuring data integrity. Master workflow orchestration tools such as Apache Airflow and Prefect; understand how to design, schedule, and monitor complex data workflows.

For more career consulting, follow D Global Consultants.

#onlinecourses #dataengineering #dataengineerjobs #dataengineers #dataengineer
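The SQL fundamentals from Stage 1 can be practiced without any cloud account. A minimal sketch using Python's built-in sqlite3 module, with an invented `orders` table; the grouping-and-aggregation pattern is the same one you would later run at warehouse scale on Hive or Redshift:

```python
import sqlite3

# Invented example data: a tiny orders table held in memory.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("alice", 30.0), ("bob", 10.0), ("alice", 20.0)],
)

# Segmentation + aggregation: total spend per customer, largest first.
rows = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC"
).fetchall()
```

The same `GROUP BY` query text is portable to most warehouse engines, which is why SQL sits in Stage 1 of the roadmap.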
-
Dear Network, RSN GINFO SOLUTIONS is seeking:

Job Title: Python Developer - Neo4J Graph Database Integration
Location: Hartford, Connecticut

We are seeking an exceptional Python Developer to play a crucial role in developing an application that integrates the Neo4J graph database with Kafka topics.

Responsibilities:
- Develop a Python application to seamlessly integrate the Neo4J graph database with Kafka topics.
- Design and implement efficient data loading mechanisms to handle large volumes of transactions using change data capture (CDC) techniques.
- Collaborate with cross-functional teams, including data engineers, data scientists, and software developers, to ensure smooth integration and optimal performance.
- Write clear, maintainable, and well-documented code following best practices.
- Develop comprehensive unit tests to ensure the reliability and robustness of the application.
- Create detailed technical documentation to facilitate ease of understanding and future maintenance.
- Implement continuous integration and continuous delivery (CI/CD) pipelines for the application.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience (5+ years) working as a Python Developer, preferably in a data-intensive environment.
- Strong proficiency in Python and experience with Python frameworks (e.g., Flask, Django).
- In-depth understanding of the Neo4J graph database and experience with graph data modeling.
- Hands-on experience with Apache Kafka and knowledge of Kafka Connect for data integration.
- Familiarity with change data capture (CDC) techniques and real-time data processing.
- Solid understanding of the software development lifecycle (SDLC) and agile methodologies.
- Experience writing unit tests using testing frameworks such as pytest.
- Excellent communication skills and the ability to collaborate effectively with cross-functional teams.
- Prior experience implementing CI/CD pipelines using tools like Jenkins, GitLab CI, or similar.

Preferred Qualifications:
- Master's degree in Computer Science or a related field.
- Experience working with cloud platforms such as AWS, Azure, or Google Cloud Platform.
- Knowledge of containerization technologies like Docker and orchestration tools like Kubernetes.
- Understanding of streaming data processing frameworks such as Apache Flink or Apache Spark Streaming.
- Familiarity with data visualization tools such as D3.js or Plotly.

#PythonDeveloper #Neo4J #Kafka #DataIntegration #CDC #CI/CD #SoftwareEngineering #GraphDatabase #DataEngineering #HartfordJobs #ConnecticutTech #TechJobs #PythonJobs #SoftwareDevelopment #DataProcessing #JobOpening #HiringNow #DeveloperJobs #TechCareer #Programming #SoftwareTesting #UnitTests #Documentation #AgileMethodology #USjob
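To give candidates a flavor of the Kafka-to-Neo4J CDC work described above, here is a minimal sketch of the translation step: turning a change event into a parameterized Cypher statement. The event shape and function name are invented for illustration (real CDC payloads, e.g. from Debezium, differ), and the Kafka consumer and Neo4j session are deliberately left out so the sketch has no external dependencies.

```python
import json

def cdc_event_to_cypher(event: dict) -> tuple:
    """Translate a simplified CDC change event into (cypher, params).

    Hypothetical event shape: {"op": ..., "label": ..., "id": ...,
    "properties": {...}} -- invented for this sketch.
    """
    op = event["op"]            # "create", "update", or "delete"
    label = event["label"]      # node label, e.g. "Customer"
    key = event["id"]           # business key of the node
    props = event.get("properties", {})

    if op in ("create", "update"):
        # MERGE is idempotent, so replayed Kafka messages are safe.
        return f"MERGE (n:{label} {{id: $id}}) SET n += $props", {"id": key, "props": props}
    if op == "delete":
        return f"MATCH (n:{label} {{id: $id}}) DETACH DELETE n", {"id": key}
    raise ValueError(f"unknown CDC operation: {op}")

# A real consumer loop would poll Kafka and run each statement in a Neo4j
# session, roughly: session.run(query, params) -- omitted here.
msg = json.loads('{"op": "create", "label": "Customer", '
                 '"id": "c-42", "properties": {"name": "Ada"}}')
query, params = cdc_event_to_cypher(msg)
```

Using parameters (`$id`, `$props`) rather than string-interpolating values keeps the statements safe and cacheable; idempotent MERGE/DELETE statements also make at-least-once Kafka delivery harmless.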
-
I'm #hiring. Know anyone who might be interested? #applyhere: ngowtham@cliecon.com #clieconsolutions #bigdata #scala #python #machinelearning #hadoop #bigdataengineer #bigdatadeveloper #w2 #w2contract #jobposting #jobpost #innovative #innovation #linkedin #linkedincommunity #love #career #computerengineer #computerscience #computersciences
-
Discover our new #JobOpportunities in the #DataEngineering sector. 🔍 Our clients, with the support of Open Search Group, are looking to fill 14 different positions in the data engineering field. Learn more about the positions and skills required in the carousel below ⤵️ Interested in one or more positions? Write to hr@opensearchgroup.com, attaching your CV, with "Spontaneous application Data Engineering" as the email subject 📧 #Python #Hadoop #Linux #Java #Springboot #MongoDB #ETL #AWS #Azure #Pandas #Git #Terraform #Jenkins #Spark #Kafka #GoogleCloud #Recruitment #OS_net #Headhunting #JobOpportunity
-
Discover our new #JobOpportunities in the #DataEngineering sector. 🔍 Our clients, with the support of Open Search Group, are looking to fill 12 different positions in the data engineering field. Learn more about the positions and skills required in the carousel below ⤵️ Interested in one or more positions? Write to hr@opensearchgroup.com, attaching your CV, with "Spontaneous application Data Engineering" as the email subject 📧 #Python #Hadoop #Linux #Java #Springboot #MongoDB #ETL #AWS #Azure #Pandas #Git #Terraform #Jenkins #Spark #Kafka #GoogleCloud #Recruitment #OS_net #Headhunting #JobOpportunity
-
Discover our new #JobOpportunities in the #DataEngineering sector. 🔍 Our clients, with the support of Open Search Group, are looking to fill 10 different positions in the data engineering field. Learn more about the positions and skills required in the carousel below ⤵️ Interested in one or more positions? Write to hr@opensearchgroup.com, attaching your CV, with "Spontaneous application Data Engineering" as the email subject 📧 #Python #Hadoop #Linux #Java #Springboot #MongoDB #ETL #AWS #Azure #Pandas #Git #Terraform #Jenkins #Spark #Kafka #GoogleCloud #Recruitment #OS_net #Headhunting #JobOpportunity