J
Hi, I am looking for job support on GCP, PySpark, and Azure
Rocky
I need offline job support in Bangalore: Databricks, Python, PySpark, Data Factory
☺️
How are the current job openings for Data Engineering?
AR
Need GCP Data Engineer support in Hyderabad... offline is preferable if possible. DM me
J
Hi, I am looking for job support on GCP and PySpark (USA)
Ares
Hi, I am looking for a Data Engineer from Brazil. Please contact me if you are interested.
Ares
Azure Cloud Data Engineer
I am looking for an Azure Cloud Data Engineer from Brazil.
Key Responsibilities:
- Design, develop, and maintain scalable cloud-based data solutions utilizing Microsoft Azure.
- Implement robust data models and data warehouses to support business intelligence and analytics needs.
- Develop and optimize ETL processes to ensure accurate and efficient data transfer between systems.
- Collaborate closely with stakeholders to gather and translate data requirements into comprehensive technical specifications.
- Monitor, troubleshoot, and enhance data pipelines to maximize performance and reliability.
- Ensure data security and compliance with relevant industry standards and regulations.
- Provide expert technical guidance and support for various data-driven projects across the organization.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related discipline.
- Proven experience as a Data Engineer with a strong focus on cloud technologies, specifically Microsoft Azure.
- In-depth knowledge of data modeling, data warehousing concepts, and industry best practices.
- Hands-on experience with Azure Data Factory, Azure SQL Database, and other Azure data services.
- Proficiency in data visualization tools and working with complex workbooks.
- Preferred: experience with Deltek.
- Strong programming skills, particularly in SQL and Python.
- Excellent problem-solving abilities with keen attention to detail.
- Ability to work effectively in a collaborative team environment and communicate complex technical concepts to both technical and non-technical stakeholders.
Preferred Skills:
- Experience with additional cloud platforms such as AWS or Google Cloud.
- Knowledge of big data technologies including Hadoop, Spark, or Databricks.
- Familiarity with DevOps practices and tools.
- Experience with data governance and data quality frameworks.
Please contact me if you are interested.
Jac
Need help with the AWS Data Engineer certification
P
Spark interview at ByteDance: https://medium.com/me/stats/post/5e54b24e2a18
Sh
I'm getting an error installing Ambari 2.7.8. How do I fix it?
Sh
Ambari 2.7.8 on Ubuntu 18. Please give me a link to HDP 3.
Sh
What is the best alternative to Ambari that is free, looks good, and is convenient to use?
K
https://youtu.be/oiFs2mSP5hk
K
https://youtu.be/QMIcr50ZhlM?si=ra3GZzw8qZibxdX6 | Watch Full Video | Python FUNCTIONS
Freelancer
Need a data engineer freelancer for job support. Experience: 5+ years. Skills: Hadoop/big data technologies, Hadoop ecosystem (HDFS, MapReduce, Hive, Pig, Impala, Spark, Kafka, Kudu, Solr), Python/PySpark. Minimum 3+ years of experience in Spark programming.
K
https://youtu.be/Kb6BQN6oJqk / WATCH FULL VIDEO / PYTHON LIST TUPLE
Chowdhury
I will continue to thank God every day for the help that @CalebMarking offered me through his online investment business. I invested $300 in his legitimate, profitable global trading platform and easily got $3,500 after trading. I didn't believe it would be possible when I decided to invest my small capital. Right now I am filled with joy, and I recommend his platform to you through the company account manager, username @CalebMarking
K
😀 Watch Full Video => Python Dictionary vs Set in Detail | with Methods and Interview Questions | https://youtu.be/YqIe5Cl_1OI
Jagadeesh
I am looking for Snowflake support. If anyone is available, please let me know.
K
Watch with English subs: this Python video to clear up advanced functions. Watch till the end: https://youtu.be/jQb8KwUb7IY
Code
[Hiring] Data Engineer with 4+ years of experience to design, build, and maintain data pipelines and infrastructure, ensuring optimal performance and reliability across systems.
K
LEARN THESE PYTHON FUNCTIONS to 10x your code | MUST WATCH | https://youtu.be/-Zev9Bnxjgs
Deepak
Hello all
Deepak
How to achieve this in PySpark?
Alexander
How to achieve this in PySpark?
from pyspark.sql.functions import col, collect_list
df.groupBy("id").agg(collect_list(col("phone_number")).alias("phone_number_list"))
Deepak
Thanks
Nayla
Hello! Any folks here involved in data modeling? I would love to get your insights on modeling pain points. DM me for more details
R@haiL
Anyone here in MLOps?
Code
Hiring AI/ML engineer - Full Time - Remote position.
SunGalaxy
I am looking for a Data Engineer with 4+ years of experience to design, build, and maintain data pipelines and infrastructure. It's a paid remote role.
meyer
Looking for data engineers to work remotely on US jobs. DM me if interested
praveen
I am looking for a data engineer who is willing to work hybrid in the McLean, VA location. The client is looking for an engineer with Spark/Python/Scala/AWS experience. If interested, send your resume to info@rishsystemsllc.com
meyer
Industry: Banking
Job Title: .NET/UI Developer
Location: 100% Remote
Duration: 12-24 months contract
Overview of the Project: An application for a modernization effort to monitor, assess, and generate specific data on how network cells are being swapped. The app offers visually compelling data reports before and after data is captured, and it monitors the workflows and processing of each cell swap. This app is critical for monitoring and documenting the modernization effort.
meyer
who can do this?
meyer
Any software engineers with network protocol experience?
Oussema
Any junior offers here?
Anonymous
I worked as a data engineer at Nike & Dell. If anyone needs my help, a DM is OK. I can help you.
Dk
Hello, I am looking for job support on the below tech stack. Ping me if anyone is interested: Python, data experience, AWS (serverless), SQL, Git, Jenkins. Must-haves: a good mix of web development and data engineering knowledge. Core skills: Python, data pipelines, AWS (serverless), SQL, GitHub, and Jenkins. Mainly a backend role with Python, including unit test case writing.
Code
Is anyone here looking for a job? Please send me a message on LinkedIn. https://www.linkedin.com/in/stefan-stasinopoulos-65b87333a/
Kumar
Need GCP data engineer and Python job support
Techie 009
Job Opportunity: Remote Job Support
We are hiring an expert for remote job support (2-3 hours daily, Mon-Fri, from 9 PM IST onwards).
Requirements:
- Experience: 5+ years overall, 3+ years in GCP and PySpark
- Proficiency in BigQuery and Apache Airflow
- Strong problem-solving skills are a must
Please DM for details.
Astha
🌟 We are HIRING! 🌟
Profile: Data Engineer
Experience: 2.5+ years
Location: Noida
Work mode: Hybrid
Mandatory skills: Azure Databricks + Azure Data Engineer
Key Responsibilities:
- Data Management: Build, maintain, and optimize data pipelines and databases in Snowflake to support sales operations.
- Data Integration: Integrate data from various sources, ensuring data quality and consistency.
- Data Modeling: Develop and maintain data models to support reporting and analytics needs.
- ETL Processes: Design and implement ETL (Extract, Transform, Load) processes to ensure efficient data flow.
- Performance Tuning: Optimize database performance and query efficiency.
- Design and Development: Lead the design, development, and implementation of data pipelines and data warehouses.
- Troubleshooting: Troubleshoot data pipelines and data warehouses to resolve issues promptly.
- Stakeholder Communication: Communicate with stakeholders to understand their needs and ensure that projects meet their requirements.
- Collaboration: Work closely with the Operations Teams in HQ to understand data requirements and deliver solutions that meet business needs.
- Data Security: Ensure data security and compliance with relevant regulations.
- Documentation: Create and maintain comprehensive documentation for data processes and systems.
Qualifications:
- Experience: 5+ years of experience in data engineering, with a focus on sales data.
- Technical Skills: Proficiency in Snowflake, SQL, and ETL tools.
- Analytical Abilities: Strong analytical and problem-solving skills.
- Communication: Excellent verbal and written communication skills in English.
- Collaboration: Ability to work collaboratively with cross-functional teams.
- Attention to Detail: High attention to detail and commitment to data accuracy.
- Adaptability: Ability to thrive in a fast-paced and dynamic work environment.
If you're ready to make an impact in Data Engineering, we want to hear from you! If you know anyone who would be a great fit, please let us know. You can send your resume to astha.sisodiya@technokrate.com
#DataEngineer #Snowflake #AWS #Mumbai #Pune #Chennai #Hyderabad #Bangalore #kolkatta #Noida #NCR #urgenthiring #immediatejoiner #ETL #SQL #Pune #django #Flask #GCP #Madhyapradesh #Punjab #Chandigarh #haryana #Maharashtra #delhi #softwareengineer #jobopportunities #remotejobs #fulltime #experience #communicationskills #skills #DataEngineer #DataEngineering #DataJobs #DataScienceJobs #BigData #TechJobs #DataAnalytics #ETL #CloudDataEngineer #DataPipeline #DataWarehouse #SQLJobs #PythonDeveloper #MachineLearningEngineer #DataEngineeringJobs #AIJobs #DataOps #Hadoop #SparkJobs #DataArchitect
Sayed
Who is studying Informatica MDM here?
SunGalaxy
[Hiring] Looking for data engineers to work remotely.
📍 Requirements:
Position: Remote (part-time, 1-3 hours/day)
Roles: Multiple positions available
Age: 23+
English proficiency: C2+ level required
Equipment: Must have a laptop and phone
Skills: Basic knowledge (no problem if none)
Proof of citizenship: Through verified social media accounts
💼 Salary: $50-$2,000/month depending on the role
A great opportunity to increase your income with minimal effort! Please DM me if you are interested.
Scott
I am really interested in this position
Sh
Hello friends, what is the best-looking software for viewing Apache HBase tables?
Nimbus
Azure data engineering full course / AWS data engineering full course
Latest August 2024 batch
Data Factory, Databricks, PySpark, Spark SQL, Synapse, basics of Python
120 hours of content
4 real-time end-to-end projects
20 modules with real-time scenarios
Interview preparation kit: latest 50 MNC interview questions
DM me
KA
#Hiring #Informatica #IDQ #EDC #AXON #Remote
Company Description: The company specializes in implementing Informatica tools for data governance, including Axon, EDC, and IDQ.
Role Description: The role involves leading the end-to-end implementation of Informatica tools for data governance. Responsibilities include user role creation, tool configuration, metadata mapping, data quality rule development, testing, and user training. Proven professional implementation experience is required.
Qualifications:
- Expertise in Informatica Axon, EDC, and IDQ.
- Hands-on experience with metadata management and data quality processes.
- Strong background in data governance frameworks.
- Proven experience in professional implementations.
Requirements:
- Complete prerequisite activities: Create user accounts and roles in Axon, EDC, and IDQ. Perform integration sanity checks between tools. Collect data quality rules from stakeholders.
- Implement Informatica EDC: Catalog data and map technical metadata to business glossary terms. Establish data lineage and conduct impact analysis for 90 KPIs. Build a knowledge graph in EDC.
- Implement Informatica IDQ: Profile data and create data quality rules for six KPIs. Develop a data quality scorecard.
- Implement Informatica Axon: Create a business glossary and define key business terms. Map users to business domains and configure Axon-EDC-IDQ integration.
- Perform testing and validation: Conduct unit, integration, and user acceptance testing. Validate data governance processes and resolve any issues.
- Migrate to production: Transfer configurations, glossary entries, and rules to the production environment. Perform final system checks.
- Conduct user training and handover: Train users on Axon, EDC, and IDQ functionality. Provide documentation and post-implementation support. Conduct a final review and obtain sign-off.
The timeline is very tight. Send your CV to obiee2000@gmail.com
Deepak
Hi all, does anyone provide Databricks certification at a lower cost? Please feel free to connect with me.
c
Hello, does anyone have the PDF of Fundamentals of Data Engineering?
c
I found it :) thanks anyway
praveen
Any data engineer profiles? Hybrid role in McLean, VA. Spark, Scala, Python, AWS. Email your resume to info@rishsystemsllc.com
Deepak
Hi, does anyone have real-time DQ (data quality) checks from a data pipeline they can share?
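Not a framework, but a minimal sketch of the kind of in-pipeline DQ checks people usually mean (the DataFrame, column names, and thresholds below are illustrative assumptions):

from pyspark.sql import DataFrame
from pyspark.sql.functions import col

def run_dq_checks(df: DataFrame) -> None:
    # Rule 1: never accept an empty batch
    total = df.count()
    if total == 0:
        raise ValueError("DQ failure: empty batch")
    # Rule 2: the key column (assumed here to be 'id') must not contain nulls
    null_ids = df.filter(col("id").isNull()).count()
    if null_ids > 0:
        raise ValueError(f"DQ failure: {null_ids} rows with null id")
    # Rule 3: at most 1% of rows may be missing phone_number (assumed column)
    missing = df.filter(col("phone_number").isNull()).count()
    if missing / total > 0.01:
        raise ValueError(f"DQ failure: {missing}/{total} rows missing phone_number")

Calling run_dq_checks(df) between the transform and load steps makes a bad batch abort the pipeline before it lands in the warehouse; libraries such as Great Expectations or Deequ cover the same idea with much more tooling.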
Nimbus
Hello, I need this
Please check your messages / DM
Nimbus
How can I get this, please?
Check your messages / DM