Data Mining Jobs
Data Mining is the process of collecting and analysing data sets to identify previously unknown patterns, correlations and trends. Using techniques ranging from statistics and probability to algorithms, data miners leverage data collection technology to extract interpretable insights from unstructured data. Data Mining also provides predictive analytics capabilities that can be applied in a variety of industries, such as market research, customer relationship management and risk management.
Data Mining experts with experience in market research, customer relations, analytics and machine learning are in high demand right now. They are able to quickly discover the hidden “gold” in large sets of data and help companies extract richer insights which they can use to make more informed decisions.
Here are some projects that our Data Mining Experts have made real:
- Capturing website data and organising it into a spreadsheet
- Gathering contact information from a list of companies
- Running research/surveys of IT employees at tech startups
- Scraping of data from directories for lead generation
- Downloading and uploading data from NIH sites
- Extracting data from Yelp
Data Mining helps clients unlock their full potential by giving them access to powerful insights that would otherwise be inaccessible. By hiring Data Mining Experts on Freelancer.com, clients can get projects done to suit their individual needs. Post your project on Freelancer.com today and hire an expert Data Mining Expert to help you.
From 159,917 reviews, clients rate our Data Mining Experts 4.9 out of 5 stars. Hire Data Mining Experts
I need a Python coder to help with basic data analysis tasks. The data is structured, so experience with CSV and SQL is essential.

Key tasks include:
- Data cleaning and preprocessing
- Data visualization
- Statistical analysis

The ideal freelancer should have:
- Proficiency in Python
- Experience with data analysis libraries (e.g., Pandas, NumPy, Matplotlib)
- Ability to work with structured data formats like CSV and SQL

Please provide examples of similar work.
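A minimal sketch of the cleaning-and-summary workflow a post like this describes, using pandas. The column names and values here are invented for illustration; a real job would start from the client's CSV or SQL source instead of an inline DataFrame.

```python
import pandas as pd
import numpy as np

# Hypothetical structured data; in practice this would come from
# pd.read_csv("data.csv") or pd.read_sql(query, connection).
df = pd.DataFrame({
    "region": ["N", "S", "N", "S", None, "N"],
    "amount": [100.0, 250.0, np.nan, 250.0, 80.0, 100.0],
})

# Cleaning: drop exact duplicate rows, then fill missing values.
df = df.drop_duplicates()
df["region"] = df["region"].fillna("unknown")
df["amount"] = df["amount"].fillna(df["amount"].median())

# Statistical summary per group (a simple stand-in for "statistical analysis").
summary = df.groupby("region")["amount"].agg(["count", "mean", "std"])
print(summary)
```

Visualization would typically follow the same pattern, e.g. `summary["mean"].plot(kind="bar")` with Matplotlib.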
I have a collection of PDF files whose contents need to be transformed into a tidy, well-structured dataset. The raw text must be extracted, checked for accuracy, stripped of duplicates or garbled characters, and logically reorganized so that the final output is ready for analysis or import into other systems. Because the emphasis is on data cleaning and organization—not simple copy-typing—I’m looking for someone comfortable with techniques such as OCR, regex, or other text-processing tools that help speed up the workflow while still allowing for a careful manual review. Clear naming conventions, consistent field ordering, and documented changes are essential. Deliverables: • A clean, fully organized data file (spreadsheet or CSV) created from the PDFs • A b...
I need a clean, one-time extraction of every registered agent listed on the Kerala RERA portal. The scope is limited to the publicly displayed agent name plus all available contact details—phone numbers, email addresses, and office addresses. No licence-status fields or property listings are required. Any stack is fine—Python (BeautifulSoup, Scrapy, Selenium), Node.js, or a headless browser workflow—as long as it handles pagination, hidden rows, or JavaScript-rendered tables and respects polite scraping practices. Deliverables • An Excel workbook (.xlsx) containing one row per agent and clearly labelled columns for Name, Phone, Email, and Address • Data fully deduplicated, UTF-8 compliant, and free of blank placeholders • A short note on the appro...
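The deduplication and blank-row cleanup this post asks for can be sketched independently of the scraping stack. The records below are made up; the idea is simply to normalise phone numbers and deduplicate on a (phone, email) key before writing the workbook.

```python
import re

# Hypothetical raw rows as a scraper might emit them (all values invented).
raw_rows = [
    {"Name": "A. Kumar", "Phone": "+91 98765-43210", "Email": "a.kumar@example.com", "Address": "Kochi"},
    {"Name": "A. Kumar", "Phone": "9876543210", "Email": "A.KUMAR@example.com", "Address": "Kochi"},
    {"Name": "", "Phone": "", "Email": "", "Address": ""},  # blank placeholder row
]

def normalize_phone(phone: str) -> str:
    """Keep digits only; the last 10 digits identify an Indian mobile number."""
    digits = re.sub(r"\D", "", phone)
    return digits[-10:] if len(digits) >= 10 else digits

def clean(rows):
    seen, out = set(), []
    for row in rows:
        if not any(row.values()):          # drop blank placeholder rows
            continue
        key = (normalize_phone(row["Phone"]), row["Email"].lower())
        if key in seen:                    # deduplicate on phone + email
            continue
        seen.add(key)
        out.append(row)
    return out

cleaned = clean(raw_rows)
print(len(cleaned))  # the two Kumar rows collapse to one; the blank row is dropped
```

The cleaned rows would then be written out with a library such as openpyxl or pandas' `to_excel` to produce the requested .xlsx deliverable.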
I have a dataset made up entirely of categorical variables and I want to understand the hidden relationships inside it. The task is strictly exploratory: I am not asking for predictive modelling, only a deep dive that surfaces meaningful patterns and trends. What I expect from you • Clean the data where needed so that the exploratory work is reliable. • Use suitable techniques for categorical exploration—cross-tabulations, chi-square tests, association rules, clustering on encoded variables, or any other method you feel is insightful. • Present the findings in clear, non-technical language supported by concise visuals (bar charts, heat-maps, mosaic plots or similar). • Provide a short, well-commented notebook or script (Python with Pandas, NumPy, SciPy, s...
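One of the techniques the post names, a cross-tabulation followed by a chi-square test of independence, can be sketched as follows. The variable names and counts are invented; only the mechanics are the point.

```python
from collections import Counter
from scipy.stats import chi2_contingency

# Hypothetical categorical records: (area_type, responded) pairs, all invented.
records = [
    ("urban", "yes"), ("urban", "yes"), ("urban", "no"),
    ("rural", "no"), ("rural", "no"), ("rural", "yes"),
    ("urban", "yes"), ("rural", "no"),
]

# Build a cross-tabulation (contingency table) by hand.
counts = Counter(records)
areas = ["urban", "rural"]
answers = ["yes", "no"]
table = [[counts[(a, b)] for b in answers] for a in areas]

# Chi-square test of independence between the two categorical variables.
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.3f}, p={p:.3f}, dof={dof}")
```

A low p-value would suggest the two variables are associated; the same table also feeds directly into the bar charts and mosaic plots the post mentions.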
I need help extracting and analyzing numeric data.

Key Requirements:
- Extract numeric data from a specified source (to be confirmed)
- Analyze the extracted data for insights

Ideal Skills and Experience:
- Proficiency in handling data extraction tools
- Experience with data analysis
- Familiarity with at least one DBMS (MySQL, PostgreSQL, or SQL Server)
- Attention to detail and accuracy

Please provide your approach and relevant experience. Looking forward to your bids!
I have a sizable dataset that first needs to be cleanly extracted and then thoroughly analyzed. Because the information comes from multiple sources, I will provide a mix of Excel/CSV tables, a small hosted database dump, and several raw text exports. Your role starts with consolidating these inputs into a single, well-structured dataset, handling any duplicates or inconsistencies along the way. Once the data is tidy, I would like meaningful insights that help me understand underlying patterns. I’m especially interested in spotting emerging trends, assembling a clear, reader-friendly report, and—if the data supports it—building a simple predictive model that I can run again in the future. Please outline the statistical or machine-learning techniques you feel are most appr...
This assignment is a hands-on data assessment and evaluation. It exposes you to the various steps that data analysts go through when they receive a dataset. This is an individual assignment. Each student is expected to complete the assignment and submit their own work. Assignment Deliverable The submitted report should include a detailed discussion of the techniques you used to answer each question (see below). Include only relevant graphs and tables along with a thorough interpretation of each. Graphs and/or tables with no explanation/interpretation will not be graded. Hand in your work in a Word or PDF format. Dataset for the Assignment Download the (see attached file) It has a sample size of 70. Assignment Objective The main objective of this assignment is to develop a better under...
We're seeking a data-driven Shopify expert to optimize our high-revenue anime jewelry brand and build a comprehensive loyalty and subscription program. What you'll do: - Build, launch, and optimize a highly engaging customer loyalty and VIP rewards program - Architect and manage a new jewelry subscription tier - Analyze user behavior and Shopify analytics to identify and fix conversion bottlenecks - Manage and oversee the loyalty program, optimizing sign-up rates, reducing churn, and increasing LTV Requirements: - Extensive experience developing and managing loyalty programs for e-commerce - Must have worked with $1M+ revenue e-commerce brands - Proven track record of improving customer lifetime value for Shopify stores - Deep knowledge of implementing and managing Shopify loya...
Acquisition Target Sourcing — Home Service Businesses, Southeastern U.S. About Us Genii Group Inc. is acquiring Baby Boomer-owned home-service businesses across the Southeastern United States. We focus on profitable, operationally sound companies whose founders are approaching retirement without a defined succession plan. We are a platform builder, not a turnaround buyer — we pass on more than we acquire. The Role We are hiring a freelance deal sourcer / skip tracer to sit at the top of our acquisition funnel. Your job is to identify and qualify acquisition targets that fit our buy box, verify owner contact information, and deliver a clean, standardized batch of leads every week. Strong leads advance into our evaluation and due diligence pipeline; weak leads waste everyone...
I'm looking for an expert in SPSS to perform descriptive statistics on a mixed dataset (both categorical and numerical). The data is sourced from existing secondary datasets.

Ideal Skills and Experience:
- Proficiency in SPSS
- Strong background in statistics, particularly descriptive statistics
- Experience working with both categorical and numerical data
- Familiarity with handling and analyzing secondary data

If you have a keen eye for detail and can deliver accurate and insightful statistical summaries, I'd love to hear from you. Please provide examples of similar work in your bid.
Data Analyst / Business Intelligence. We are looking for a data analyst able to understand client needs, explore the existing data sources, define business KPIs based on the two previous points, and validate the results obtained. In our projects, data processing almost always has some unconventional element: we detect patterns with Machine Learning, recognize objects in images, synthesize information with AI (Artificial Intelligence) and LLM models, and we even work with audio! Detailed knowledge of all these technologies is not required; rather, you should understand the end-to-end implementation of data solutions and be able to coordinate with other colleagues to ...
Hi, I need to send a form message to all listings in all states. Here's the directory. The data will be the same for all forms.
Government portals publish public records but do not offer bulk-download options. I need an automated solution that can search by number on this page and download each file in its native PDF form. Here is what I am after:
• A repeatable Python scraper capable of searching a specific domain, following pagination, and collecting every accessible PDF link.
• The script should save the PDFs locally in a clear folder structure (site / year / category).
• A simple log or CSV report listing the URL, document title, and download status for every file processed.
Acceptance criteria:
1. All public records published in the specified date span are present as intact PDFs.
2. The log matches the count of files actually downloaded.
Please make sure the code is well c...
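The folder layout and CSV log the post specifies can be sketched as below. The site names, URLs, and metadata are invented, and the network fetch is stubbed out; a real scraper would replace the stub with `requests.get(url).content` plus pagination handling and polite rate limiting.

```python
import csv
from pathlib import Path

# Hypothetical metadata a scraper might collect for each PDF link
# (site, year, category, title, url are all invented for illustration).
found = [
    ("example.gov", "2023", "permits", "Permit 001", "https://example.gov/p1.pdf"),
    ("example.gov", "2024", "notices", "Notice 17", "https://example.gov/n17.pdf"),
]

def save_pdf(root: Path, site: str, year: str, category: str, title: str, data: bytes) -> Path:
    """Save under the site / year / category folder structure the post asks for."""
    folder = root / site / year / category
    folder.mkdir(parents=True, exist_ok=True)
    path = folder / f"{title}.pdf"
    path.write_bytes(data)
    return path

root = Path("records")
log_rows = []
for site, year, category, title, url in found:
    # Stub: a real scraper would download the file here.
    data = b"%PDF-1.4 stub"
    save_pdf(root, site, year, category, title, data)
    log_rows.append({"url": url, "title": title, "status": "downloaded"})

# CSV report listing URL, document title, and status for every file processed.
with open("download_log.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["url", "title", "status"])
    writer.writeheader()
    writer.writerows(log_rows)
```

The acceptance check that "the log matches the count of files actually downloaded" then reduces to comparing `len(log_rows)` against the number of PDFs on disk.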
Project Title: Large-Scale Course Data Scraping from Udemy, Coursera, and YouTube Project Overview: I am looking for an experienced data scraping specialist who can collect a large dataset of online courses and tutorials from platforms such as Udemy, Coursera, and YouTube. The dataset should include both technical and non-technical tutorials. The objective is to build a structured dataset that can be used for research and analytics purposes. Project Requirements: The freelancer will scrape approximately 200,000 course/tutorial records from the following platforms: - Udemy - Coursera - YouTube The collected dataset should contain the following information for each record: 1. Course / Tutorial Title 2. Course URL / Video URL 3. Platform Name (Udemy / Coursera / YouTube) 4...
Project Title: Apartment RWA Database (Bangalore & Gurugram) with Verified Contact Details – Fixed Budget

Project Overview: We are hiring an experienced data researcher to build a high-quality database of apartment societies (RWAs) in Bangalore and Gurugram, including verified association-level contact details. This is a precision-driven research task, not bulk scraping. Accuracy and verification are critical.

Scope of Work:
- Target: only apartment / gated societies
- No villas, plots, or commercial properties
- Cities: Bangalore and Gurugram

Required Data Fields:
- Apartment / Society Name
- Location (Area, City, Pincode)
- Google Maps Link
- Approx. Number of Units (if available)
- Contact Details (Strict Requirement): Society Office Contact Number (President / Secretary / Treasurer – ...
I’m assembling a permission-friendly contact database and need your scraping expertise to do it right. Together we’ll agree on the exact niche, then you’ll hunt down the most relevant company sites and publicly available pages, extract the key details, and hand everything back cleaned, de-duplicated, and ready for outreach. What absolutely must appear for every record is the person’s name, a valid email address, and a phone number. When the site also lists a job title or city, capture those too—they’re useful but not compulsory. I’ll be running random spot checks, so accuracy and a clear audit trail (source URLs, time-stamp, etc.) are essential. Deliverables: • An Excel workbook with separate columns for Name, Email, Phone, Job Title, City...
I have several months of raw sales data that I need turned into clear, actionable insight. The sole focus is to identify meaningful trends—seasonality, product-line performance, regional differences, and any other patterns that stand out—so I can make smarter business decisions. Here is what I will hand over: the complete sales dataset exactly as it sits today. If you are comfortable working in Excel/CSV, databases, or cloud-based tables, great—I can supply the files in whichever of those formats you prefer. Here is what I expect back: • A concise analytical report (PDF or slide deck) highlighting the key trends you discover, complete with charts or visualisations. • A clean, well-commented workbook or script so I can replicate or extend the analysis later...
I need a clean, up-to-date list of life insurance agents’ email addresses gathered through web scraping. The data has to come from three source types that I will specify once we start: social media platforms, well-known industry directories, and individual company websites. Along with every email, please include the agent’s full name so the file is immediately usable for targeted outreach. It's also important that all the emails are Canadian brokers. For delivery, a CSV or Excel file with clearly labeled columns (Name, Email, Source URL) works best. Accuracy is critical—no bounced or role-based addresses—and I will run random spot checks before releasing final approval. If you already have scripts or scraping tools ready for these sites, feel free to use them...
I need to compile a clear, reliable picture of how female freelancers are distributed, what skills they market, typical earnings ranges, and any notable growth patterns across major regions. The job starts with data extraction: pull publicly available information from leading freelancer platforms, professional networking sites, and other open repositories. After cleaning and de-duplicating the records, the next step is to analyse the dataset—producing descriptive statistics, trend plots, and concise written commentary that highlights regional hot-spots, in-demand skill sets, and any gaps or opportunity areas. Please deliver: • A CSV or Excel file containing the raw and cleaned datasets, accompanied by a short data-dictionary. • An analytic report (PDF or slide deck) that...
SPANISH SPEAKERS ONLY. I need one-off support for a university assignment that I have to develop in R Studio. The data I will be working with is numeric, and the goal is to extract clear information through descriptive analysis. The deliverable must include:
• Measures of central tendency (mean, median, mode).
• Measures of dispersion (variance, standard deviation, range).
• Frequency distributions with tables and basic plots in ggplot2 or native R tools.
I also require:
– Well-commented code in an .R script so that I can reproduce each step.
– A short report in PDF or Word explaining the results and their academic interpretation.
– Ses...
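The post asks for R specifically, but the measures it lists are standard; as a language-neutral illustration, here is the same computation sketched with Python's standard-library `statistics` module, on an invented sample.

```python
import statistics as st

# Illustrative numeric sample (the actual dataset would be supplied by the client).
data = [4, 8, 6, 5, 3, 8, 9, 5, 8]

# Measures of central tendency.
central = {
    "mean": st.mean(data),
    "median": st.median(data),
    "mode": st.mode(data),
}

# Measures of dispersion.
dispersion = {
    "variance": st.variance(data),   # sample variance
    "stdev": st.stdev(data),         # sample standard deviation
    "range": max(data) - min(data),
}
print(central, dispersion)
```

In R the equivalents would be `mean()`, `median()`, `var()`, `sd()`, and `table()` for the frequency distributions, with ggplot2 handling the plots.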
Customer Churn Analysis & Prediction

Project Overview: Conducted an end-to-end Customer Churn Analysis project to identify patterns and predict customer attrition. Focused on both data exploration (EDA) and machine learning modeling to generate actionable business insights. Aimed to help businesses reduce churn, improve customer retention, and increase profitability.

Dataset Details: The dataset includes customer demographic, financial, and behavioral features:
- RowNumber, CustomerId, Surname
- CreditScore, Geography, Gender
- Age, Tenure, Balance
- NumOfProducts, HasCrCard, IsActiveMember
- EstimatedSalary
Target Variable: Exited (Churn Status)

Tools & Technologies: Python
Libraries:
- pandas, numpy (data processing)
- matplotlib, seaborn (data visualization)
- scikit-learn (ML modeling)
Advanced Te...
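A minimal sketch of the modeling step described above, using scikit-learn on synthetic data. The churn rule and features here are invented stand-ins for the listed columns (Age, Balance, IsActiveMember, Exited); a real project would load the actual dataset and add proper feature scaling and evaluation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Synthetic stand-in for the described dataset.
rng = np.random.default_rng(0)
n = 500
age = rng.integers(18, 70, n)
balance = rng.uniform(0, 200_000, n)
is_active = rng.integers(0, 2, n)
# Invented rule: older, inactive customers churn more often.
exited = ((age > 50) & (is_active == 0)).astype(int)

X = np.column_stack([age, balance, is_active])
X_train, X_test, y_train, y_test = train_test_split(X, exited, random_state=0)

# Fit a baseline classifier and score it on the held-out split.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

In practice the baseline would be compared against tree-based models, and metrics such as recall and ROC-AUC matter more than raw accuracy for imbalanced churn data.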
Job Title: Build Bond Price Research System Using BigQuery and Machine Learning

Job Description: We are seeking an experienced freelancer to develop a Bond Price Research System based on client requirements, combined with Machine Learning to enable powerful search, analysis, and insights into bond prices.

Project Scope:
- Design and build a scalable research platform for searching and analyzing bond price data (government bonds, corporate bonds, etc.).
- Store and query large volumes of historical and current bond data in BigQuery.
- Integrate Machine Learning models to support price analysis, trend detection, forecasting, and other research features.
- Customize the solution according to the client’s specific needs.

Key Requirements:
1. Technical Skills (Required):
- Strong...
We are seeking a skilled data analyst to assist with the following tasks:
- Data cleaning and preprocessing, including handling missing values and duplicates.
- Conducting Exploratory Data Analysis (EDA) to identify trends and actionable insights.
- Building interactive dashboards and reports using Power BI.
- Generating actionable business insights to support decision-making.

Preferred Tools/Software:
- Excel (advanced formulas, pivot tables)
- SQL (data extraction and querying)
- Python (Pandas, NumPy, Matplotlib)
- Power BI

The project is expected to be completed within 2 to 3 weeks, depending on complexity. The fixed budget for this project is ₹20,000, with slight flexibility based on scope and quality.
I’m looking for a fresh, 2026-updated dataset of U.S. residential contacts that I can drop straight into my campaigns without worrying about bad emails or dead phone numbers. The file must cover every state and include only individuals aged 45 or older. For each contact I need a full name, a working email address, and a reachable phone number. Based on U.S. demographics, the approximate record count should be around 100-120 million. I’ll be running the list through ZeroBounce and similar verification tools, so I expect a true 0% bounce rate. Please deliver everything as well-structured, state-wise CSVs so I can import them immediately. Deliverables • One or multiple CSVs containing validated records (name, email, phone, age 45+) for all 50 states • Verification repo...
I need a Python program to automate KYC information updates on the Oracle CCB website. The fields to be updated include:
- Identity Information
- Address Details
- Contact Information

The data for the KYC updates will be provided via manual entry.

Ideal Skills and Experience:
- Proficient in Python
- Experience with web automation (e.g., using libraries like Selenium)
- Familiarity with Oracle CCB
- Strong understanding of KYC processes and data handling
- Ability to create user-friendly input interfaces for manual data entry

Please ensure the solution is secure and reliable.
We're putting together a brand-new database of private homeowners who have a swimming pool on their property, and we need your help compiling it from scratch. The scope is truly worldwide, so we're not limiting you to any single country or region; wherever reliable data exists, we want it captured. For every record you collect, the following fields are mandatory:
• Full name
• Email address
• Mobile or landline number
• Country
• Home address (street, city, state/region, postal code)
Please provide the finished list in a clean spreadsheet or CSV that we can sort and filter easily. We will spot-check a sample of the contacts against publicly available sources, so only submit information you have personally confirmed. If you have experience build...
I will give you a list of 10,000 individual names together with the URL of an online directory that includes a built-in search box. Your job is to look up each person, copy just two data points—email address and phone number—and record them neatly in a spreadsheet (Excel or Google Sheets is fine). Because only email and phone are required, no extra fields such as address, company name, or job title are necessary. Consistency matters: please keep one row per person, use separate columns for email and phone, and avoid duplicates or partial entries. I will spot-check accuracy, so gathering the correct details the first time is essential. To help you start smoothly, I will provide a small test batch; once that’s approved you can proceed with the full 10k. If you have tools ...
I need a fresh, well-verified list of 100,000 Hotmail addresses that belong specifically to high-net-worth investors. My focus is on Switzerland, the USA, Ireland, Canada and the UK, as these regions align with my current outreach strategy in the crypto and alternative-investment space. What matters most is quality and relevance: every address must belong to someone who genuinely meets a high-net-worth profile. While I’m open to any extra fields you can lawfully append (name, city, asset class interests, etc.), the core deliverable is the email itself. No phone numbers or portfolio breakdowns are required at this stage. Deliverables & acceptance criteria • 100,000 unique Hotmail addresses of confirmed high-net-worth investors • Supplied in CSV or Excel with at lea...
I need a skilled web scraper to gather phone contact information for sales inquiries. This data will be used primarily for lead generation.

Key requirements:
- Scrape phone contacts from specified sources
- Ensure data accuracy and up-to-date information
- Deliver data in a structured format (e.g., CSV, Excel)

Ideal skills and experience:
- Proficiency in web scraping tools and techniques
- Experience with data validation and cleaning
- Attention to detail and ability to meet deadlines

Looking forward to your proposals!
Project Description: Looking for a skilled XPath rules expert to assist with data scraping from Website A. I need help building XPath rules to structure the data and troubleshooting the data scraping process.

Ideal Skills and Experience:
- Strong expertise in XPath rules and data scraping techniques
- Proven experience in successfully scraping data from websites
- Proficiency in troubleshooting issues related to data scraping
- Attention to detail to ensure accurate and reliable data extraction
- Excellent problem-solving skills to overcome any challenges that may arise during the process
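A small illustration of the kind of XPath rules the post describes, using Python's standard-library ElementTree, which supports a limited XPath subset (tag paths plus attribute predicates). The page fragment and class names are invented; production scraping work would typically use lxml, which implements full XPath 1.0.

```python
import xml.etree.ElementTree as ET

# Hypothetical page fragment (structure and class names invented for illustration).
html = """
<html><body>
  <div class="product"><span class="name">Widget</span><span class="price">9.99</span></div>
  <div class="product"><span class="name">Gadget</span><span class="price">19.99</span></div>
</body></html>
"""

root = ET.fromstring(html)

# XPath-style rules: select every product node, then the name/price inside it.
products = []
for node in root.findall(".//div[@class='product']"):
    name = node.find("./span[@class='name']").text
    price = float(node.find("./span[@class='price']").text)
    products.append({"name": name, "price": price})

print(products)
```

Troubleshooting such rules usually means checking whether the live page is well-formed, whether attributes differ between rows, and whether content is injected by JavaScript (in which case a headless browser is needed before any XPath applies).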
Academic Research Report & Presentation Project Overview I am looking for a skilled freelancer to complete an academic project for (Management Information Systems). The project requires in-depth research on a selected technology topic, a formal written report (10–15 pages), and a PowerPoint presentation to be delivered on the submission date. Topic The freelancer will research and write about one of the following topics IN Saudi Arabia (to be agreed upon): • Data Mining & Big Data • E-Government • E-Business & E-Commerce • Intranet & Extranet • Collaboration & Social Business • Blockchain Technologies (Cryptocurrencies, Distributed Ledgers, etc.) • Cybersecurity • Mobile and Cloud Computing • Fintech (Proptech, Insur...
My sales reps close deals confidently once a prospect is on the phone, yet our pipeline in the cosmetics niche is too thin to hit this year’s revenue targets. I need a specialist who can build and execute a lead-generation engine that continuously feeds the team with qualified, purchase-ready contacts. The focus is strictly on driving new, high-quality leads—not general brand awareness. I manufacture and distribute skincare and makeup lines, so familiarity with beauty-industry buyer personas, retail and e-commerce channels, and the latest social/SEO trends will be very helpful. I’m open to any proven mix of tactics—data mining, LinkedIn prospecting, paid social, influencer collaborations, or creative sampling funnels—as long as they deliver measurable growth....
I need a Python expert to take several CSV and Excel files and turn them into solid, statistically sound insights. The work is focused entirely on data analysis and processing—no web scraping or app development—so your time goes straight into cleaning the raw tables, exploring patterns, and building the statistical models that answer my business questions. You should feel at home with pandas for wrangling, NumPy and SciPy for numerical work, and a modeling library such as scikit-learn or statsmodels to run regressions, clustering, or any technique you recommend. If you prefer working in a Jupyter notebook, that’s perfect; a well-commented .py script is also fine. I’ll supply the data files and a brief outlining the hypotheses I want tested the moment we start. Del...
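The regression step mentioned above can be sketched in a few lines with SciPy. The two columns and their linear relationship are invented purely to show the mechanics; a real engagement would load the client's CSV/Excel files with pandas and test the supplied hypotheses.

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical columns after cleaning (in practice loaded via pandas.read_csv).
ad_spend = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
revenue = 2.0 * ad_spend + 1.0          # invented linear relationship

# Ordinary least-squares fit of revenue on ad_spend.
result = linregress(ad_spend, revenue)
print(f"slope={result.slope:.2f}, intercept={result.intercept:.2f}, r={result.rvalue:.2f}")
```

For multi-variable models, statsmodels' `OLS` or scikit-learn's `LinearRegression` would replace `linregress`, and the notebook would report confidence intervals and residual diagnostics alongside the point estimates.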
We are a growing startup working on a highly private and confidential internal project. Due to the sensitive nature of this initiative, we require an experienced Cybersecurity Expert with a strong background in advanced security operations, threat intelligence, and secure information handling.

Key Requirements:
- Proven expertise in cybersecurity, network security, and information protection
- Advanced skills in online research, deep investigation, and gathering intelligence from various web sources
- Experience with secure data collection, analysis, and management techniques
- Ability to work with complex security tools and environments while maintaining complete confidentiality
- Strong knowledge of anonymity practices, secure communication, and protective measures in challenging online scenarios...
Please reply back; I have a Propstream account. I need a clean, ready-to-use lead file built from two sources: Propstream and Apollo.io. From Propstream, the focus is commercial properties and apartment assets; every record must include full property details plus up-to-date owner contact information. Apollo.io will serve as a second source, so the final list contains additional prospects pulled from that platform (matching or related to the same asset classes). All data must be scraped, deduped, and formatted in a single spreadsheet so I can sort, filter, and launch campaigns immediately. Use whatever stack you prefer—Python, Selenium, BeautifulSoup, Apify, or similar—but the workflow has to respect each site’s TOS and deliver reliable results. Deliverables • CSV/Excel...
I have a list of roughly 1500 URLs—each coming from the same automotive website—that together cover the top 100 makes, models, grades and variants sold in Australia. I need every data point the site makes available for each of those vehicles, from the obvious specs such as year, make, model and variant right through to driveway prices, engine type, transmission, drive configuration, warranty details, fuel-economy figures, in-car technology features, seating layouts and any other attributes exposed on the page. The end goal is a clean, analysis-ready Excel workbook that lets me run market-wide comparisons, so consistency is critical: headings must be standardised, units normalised and categorical values written the same way across the entire sheet. I am happy for you to use P...
Freelancer Wanted: Australian Geographic Data & Mapping Web App We're looking for an experienced full-stack developer to build a sophisticated geographic data visualisation platform focused on Australia. This is a substantial project with multiple stages, and we're looking for someone who can commit to seeing it through end to end. What you'll be building A web-based mapping and data tool that lets users explore census and other datasets across Australian geographic areas — from the national level right down to individual mesh blocks. Think of it as a customisable, interactive community insights platform. Stage 1 – Geography layer Build on a Google Maps base and load in the full suite of 2021 ABS geographic boundaries, including mesh blocks, SA1s, SA2s, SA...
Job Title: Freelance Data Extractor / Virtual Assistant for Educational Materials Project Description: I am currently organizing extensive study materials and need a meticulous freelancer to help extract and map specific information from large documents. Your Responsibilities: Review Materials: You will be provided with comprehensive subject-wise PDFs and a corresponding "blueprint" document for each subject. Keyword Matching: Use the specific keywords listed in the blueprints to search through the PDFs. Data Extraction: Extract the exact topics, paragraphs, or sections from the PDFs that align with those keywords. Formatting: Compile the extracted information into a structured, easy-to-read format directly into my Notion workspace (or a standard Google Doc/Word file), organized ...
I have already gathered a sizeable spreadsheet of phone numbers, email addresses and IP addresses; now I need all of those crumbs stitched together so I can see the real-world picture behind them. The most valuable outcome for me is showing how each digital identifier connects to WiFi networks, SSID names and the physical addresses where those networks sit. You will receive - CSV files containing the phone, email and IP data. - A handful of screenshots and documents whose original EXIF and other metadata are still intact. - A brief note on the investigative context so you understand why specific links matter. What I need back - A clear, well-structured report that walks through every correlation you uncover, complete with confidence levels and the logic or tool output that supports each ...
I want you to select a research paper for me from 3 websites that I will share with you. After that, I need to prepare a presentation on the paper; I will share the guidelines with you. If you can, also share notes on things we need to add to the presentation. The deliverables are a PowerPoint presentation and the paper with the answers and key points highlighted, so it becomes easy for me to map the information. Please make sure you do not use an AI tool to write the presentation, because I have an AI detector.
We need an experienced Azure Data Engineer to design and implement robust data solutions on the Azure platform. You'll work on ETL processes, data analytics, and collaborate with cross-functional teams to deliver scalable data engineering solutions.

Requirements:
• Strong experience with Azure Data Factory (ADF) for ETL processes
• Proficiency in Azure Databricks for advanced analytics
• Hands-on experience with Azure Data Lake Storage (ADLS)
• Experience with Azure Synapse Analytics for real-time analytics
• Strong SQL skills for querying and database optimization
• Python programming for scripting and automation
• Experience with data modeling and data warehousing concepts
• Excellent communication skills to work with stakeholders
• Bach...
We need an experienced Informatica BDM developer to join our team for full-time contract work supporting data engineering and ETL development projects.

Requirements:
• 7+ years of experience with Informatica Data Engineering, DIS and MAS
• Strong expertise in Databricks and Hadoop ecosystems
• Proficiency with relational SQL and NoSQL databases (Azure Synapse, SQL Server, Oracle)
• Experience with major cloud platforms (Azure, AWS, or Google Cloud)
• Knowledge of Agile methodologies and tools like SCRUM, TFS, and JIRA
• Advanced SQL skills including T-SQL and PL/SQL
• Experience building and optimizing big data pipeline architectures
• Hands-on experience developing both batch and real-time workloads
• Knowledge of Data Lake and dimensional dat...
We are hiring remote contributors to create photo-based language data using everyday materials found around you. This project focuses on collecting natural, real-life text captured through a phone camera. What You’ll Do - Photograph common objects that contain written text (printed or handwritten). - Provide three unique shots per item, changing position, distance, or lighting. - Ensure content is original and varied. - Most of the visible text (minimum 75%) must be in your local language. Eligibility - Fluent in the target language (native or near-native). - Physically located in a country where the language is used. - Own a smartphone capable of taking clear photos. How It Works - Upload images through a Google Form. - Submissions are reviewed individually. - Only valid, clear, ...