Daft.ie scrape projects
I need a script that scrapes Facebook Ads, via the API or otherwise, for a predefined keyword and saves the data centrally: either into a database so it can be sorted by the criteria below, or into an Excel file. Fields required in the table: - ad publication date - link inside the ad - link to the ad in FB Ads - number of ads run on that creative (THE MOST IMPORTANT ASPECT) - ad text - video link.
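A minimal sketch of the Facebook-ads pull requested above, using Meta's public Ad Library API (the `ads_archive` Graph endpoint). The API version, field names, and response shape are assumptions to verify against the current Ad Library documentation; the keyword and token are placeholders, and note the API does not directly report how many ads run per creative, so that count would need separate aggregation:

```python
import csv

# Hypothetical endpoint/version for Meta's Ad Library API; verify
# against the current documentation before relying on it.
ADS_ARCHIVE_URL = "https://graph.facebook.com/v18.0/ads_archive"

def build_params(keyword: str, access_token: str) -> dict:
    """Query parameters for a keyword search (Romania used as an example)."""
    return {
        "search_terms": keyword,
        "ad_reached_countries": '["RO"]',
        "fields": "ad_creation_time,ad_creative_bodies,ad_snapshot_url",
        "access_token": access_token,
    }

def to_row(ad: dict) -> dict:
    """Flatten one API record into the requested spreadsheet columns."""
    return {
        "published": ad.get("ad_creation_time", ""),
        "ad_text": " ".join(ad.get("ad_creative_bodies", [])),
        "ad_link": ad.get("ad_snapshot_url", ""),
    }

def save_rows(rows: list, path: str) -> None:
    """Write the flattened rows to a centralized, Excel-importable CSV."""
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["published", "ad_text", "ad_link"])
        writer.writeheader()
        writer.writerows(rows)
```

Fetching would then be a plain `requests.get(ADS_ARCHIVE_URL, params=build_params(...))`, mapping `to_row` over each entry in the response's `data` list and following the paging cursor for subsequent pages.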
I need the data from the box-score panels for the game, as well as those for each … I would like to be able to pull the data for a chosen season, e.g. (2019-2020), to be able to change it, and to pull data for any season I want.
I need to pull odds from 3 Romanian bookmakers (Superbet, Betano, Fortuna), for these sports (football, tennis, basketball, ice hockey, handball), with a refresh rate into my software of at least every 5 minutes. I need to compare these odds with the average odds from … for each match; I need all the bet types and odds available on Odds Portal for a specific game, and to compare them with the same bet types from the Romanian bookies for the same game. I need filters for (hours before the start of the game, value, switching between Romanian bookies, odds drop (which I think means you will have to store the old odds and compare them with the new ones after each update)). I also need a comparison of values between odds ...
I need a Tampermonkey script that will do the following. Please ensure you watch the video in full before replying to me. This task is urgent; I expect it to be completed inside 2-3 hours. If you are confident you can complete this task, let me know and I will authorise a project for you. Finished Product Requirements: - Do exactly as explained in the video - Ability to select invoices using a tick box and click "Delete" - No additional command or input required from the user - Be written on an approved platform (i.e. Tampermonkey) - Work 100% Please ensure you understand these requirements before commencing work.
I'm looking for a person who does black-and-white, fashion-style drawn illustrations. Awaiting offers and a portfolio. Thank you.
I'm not necessarily looking for certified translators, but I do want proof that you translate well. I'm interested in a translation of the website (the important part of it; it doesn't need to be 100%; for example, the contacts page doesn't need translating). I'm after the lowest possible price. You may also use Google Translate to help you.
I'm looking for someone to build me a WordPress site; more precisely, a theme, plus some plugins installed. The site must: - be responsive - let me make later changes to the pages - be cross-browser compatible (I don't care about old IE versions, but I do specifically care about Safari) Note: not all pages share the same design!
...YouTube API via the console, Twitter API via the console, Pinterest API via the console 4 build the site responsive for phone and tablet 5 very clear delimitation of the work areas, a custom map with the work locations: a single map ,26.114845&spn=0.036751,0.051498&z=13&source=embed&dg=feature SEO ON PAGE 7 a more prominent link to the sign-up page 8 short load time 9 optimize H1, H2, meta description, and meta title for every page of the site 9 sub-pages for selecting dancer shows 10 sub-pages for first-dance (wedding dance) courses 11 keep the site's current theme
...com/+/web/api/javascript Facebook via the API console, YouTube API via the console, Twitter API via the console, Pinterest API via the console 4 build the site responsive for phone and tablet 5 create a custom map with the work locations: a single map ,26.114845&spn=0.036751,0.051498&z=13&source=embed&dg=feature SEO ON PAGE 6 rank these keywords in the top 3: 1 cursuri de dans bucuresti 2 cursuri de dans pentru copii 3 scoala de dans copii 4 cursuri de dans pentru nunta 5 valsul mirilor 6 dansul mirilor 7 dansatori profesionisti pentru evenimente 8 cursuri de balet 9 cursuri balet pentru copii 10 scoala de dans bucuresti
For an upcoming market research study, I need a fully-automated workflow that gathers and enriches data from well over 500 LinkedIn profiles. The automation should locate the profiles that match criteria I will provide, pull the key public details, then append reliable off-platform contact information so I can reach those professionals directly. Please design the script or low-code sequence with any reliable stack you prefer—Python, Selenium, PhantomBuster, Sales Navigator API, or comparable tools are fine as long as the method is repeatable and respects rate limits. Deliverables • CSV/Excel file containing one row per person with: – Current job title – Company name – Verified email (and phone, when available) • Source code or workflow fi...
I need two automated, repeatable scraping pipelines. The first will harvest business names, full contact information, user reviews with rating...headless behind rotating proxies without tripping rate limits. Deliverables: • 4 working scripts (Maps + websites) with clear setup instructions • Sample output files proving all requested fields are captured correctly • Output data must be organised by City Name > (Excel file with the list of data + folders with images of each business's data) • A short read-me covering dependencies and how to rerun the scrape with new location or URL inputs • Each script will be paid Rs. 2,500 including support. I will review by loading your CSVs and spot-checking a random sample against the live pages; accuracy and field completeness ...
...through all job listings, including: Pagination (page numbers, next/previous, etc.) “Load more” buttons Infinite scrolling Ability to fetch data from multiple pages (e.g., page 3, 4, or beyond) Apply job filters, especially location-based filtering, so that only job links for specific locations are collected Extract only individual job posting links after filters are applied Visit each job link and scrape complete job details, including: Job title Job description Location Employment type (if available) Department / team Any other relevant metadata Additional Requirements Script should be scalable for hundreds or thousands of companies Handle different career page structures (custom sites, Greenhouse, Lever, Workday, etc.) Support static and dynamic content (HTML + Jav...
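The pagination requirements above (numbered pages, "load more" buttons, infinite scroll) all reduce to the same loop: fetch a page, collect its links, follow the next cursor until none remains. A site-agnostic sketch, where `fetch_page` is a hypothetical callable standing in for whatever requests/Playwright routine suits a given career site:

```python
def collect_all(fetch_page, start_url, max_pages=1000):
    """Collect job links across every page of a listing.

    `fetch_page(url)` is any site-specific routine (requests, Playwright,
    or a background API call) returning (links_on_page, next_url), with
    next_url None once pagination is exhausted. The same loop shape covers
    numbered pagination, "load more" endpoints, and cursor-style APIs.
    """
    seen_urls, links, url = set(), [], start_url
    for _ in range(max_pages):        # hard cap guards against cycles
        if url is None or url in seen_urls:
            break
        seen_urls.add(url)
        page_links, url = fetch_page(url)
        links.extend(page_links)
    return links
```

Per-site adapters (Greenhouse, Lever, Workday, custom) then only need to implement `fetch_page`; the collection, de-duplication of visited pages, and stopping logic stay shared.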
My service company relies on SupplyP...appear in my Housecall Pro dashboard with correct dates and times. • No duplicate records are created on subsequent runs. • Configuration variables (API keys, endpoints, schedule frequency) are isolated in a single settings file or environment block. I have full API credentials and documentation for Housecall Pro but nothing similar for SupplyPro. If their platform lacks an open API, a screen-scrape, CSV export, or other creative workaround is fine as long as it’s reliable and keeps my data secure. Let me know how you would approach the pull from SupplyPro, any libraries or tools you would lean on (Python requests, Node, Zapier hooks, AWS Lambda, etc.), and an estimated timeline for a simple proof of concept followed by pr...
I own a garage website currently sitting on my GoDaddy hosting. The code base is Custom HTML/CSS and, unfortunately, the original developer is now charging extortionate fees while keeping hold of the files and database. I need a skilled web developer who can: • Scrape or export every line of code, text, and image from the live site • Re-create the database so I regain full control of customer bookings and quotations • Deliver an enhanced design—fresh look, modern UI, fully mobile responsive • Add a dedicated HGV calendar so lorry MOT slots can be managed separately from standard vehicle work • Integrate Elavon for secure online payments alongside our existing booking and quoting flow The current system already lets customers book repairs and ...
I am launching a small (20-member, soon larger) WhatsApp engagement group for TikTok creators and I need a fast, transparent way to see who actually comments on every reposted video. What I need built • A Google Sheet or Excel file that lets me paste each day’s repost link and, in minutes, shows a gre...account • Clear, step-by-step doc (or short screen-recording) showing how to add new links, adjust the member list, and trigger notifications • Brief support window after delivery to be sure everything runs smoothly I value speed, fairness, and simplicity, so clean code, minimal manual clicks, and an intuitive layout are more important to me than fancy visuals. If you have a proven way to scrape TikTok comment data into Sheets/Excel and can set up the...
Please Read Carefully Before Applying It does not matter whether you consider yourself a “vibe coder” or a traditional software engineer we accept both here. What matters is whether you can make this system work reliably at scale. We operate a production scraper that processes 500+ leaderboard sites per hour. All sites we scrape are leaderboards, but no two sites are the same. This is not a basic scraper. What Makes This Scraper Different The leaderboards we scrape vary heavily in structure and behavior: Dynamic buttons, tabs, and switchers JavaScript-rendered content Hybrid navigation (UI interaction + background API calls) Tables, card layouts, podium layouts, or combinations of all three Masked usernames and inconsistent rank formats Different ordering of wager...
B2B Lead Generation & Verification – NYC Event Venue Project Description: I’m looking for a detail-oriented lead generation specialist to manually research and verify potential clients for a premium NYC event venue (). This is not a Google Maps or database scrape—accuracy and validation matter. Scope of Work: Provide a spreadsheet with [e.g., 200] fully qualified B2B leads relevant to hosting events in NYC. Each lead must include: Business Name Decision-Maker Name (Owner, Event Planner, or Manager — no generic contacts) Verified Email and/or Direct Phone Number Category Tag (see below) Target Categories: Private Chefs (pop-ups, tastings, chef-led dining experiences) Wedding Planners (NYC-focused, mid- to high-end) Corporate / Event Planners (off-sites, br...
I need clean, structured product details pulled from the web and delivered as a single, well-formatted CSV file. The focus is strictly on product information—no contact data or social media posts are required—so your crawler can stay lean and purpose-built. I already have a clear idea of the attributes I want captured (title, price, SKU, description, availability, image URL). Once we agree on the target sites, you can build a scraper, run it, and hand back the CSV along with the script or notebook so I can reproduce the results later if needed. Please let me know: • Which language or framework you plan to use (Python, Scrapy, BeautifulSoup, Selenium, Playwright, etc.). • How you’ll handle pagination, anti-bot measures, and site structure changes. • ...
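For product attributes like those listed above, one lean and relatively robust approach is reading the schema.org JSON-LD blocks many shops embed, rather than scraping the rendered DOM. A stdlib-only sketch; the field mapping follows schema.org's Product/Offer vocabulary and would need adjusting per target site:

```python
import json
import re

# Matches embedded structured-data blocks; DOTALL so multi-line JSON works.
LDJSON_RE = re.compile(
    r'<script[^>]*type="application/ld\+json"[^>]*>(.*?)</script>',
    re.DOTALL,
)

def extract_products(page_html: str) -> list:
    """Pull Product records out of embedded JSON-LD blocks.

    Sites that nest offers as lists, or that use microdata instead of
    JSON-LD, need per-site tweaks; this handles the common simple case.
    """
    rows = []
    for block in LDJSON_RE.findall(page_html):
        try:
            data = json.loads(block)
        except json.JSONDecodeError:
            continue                      # skip malformed blocks
        for item in data if isinstance(data, list) else [data]:
            if item.get("@type") == "Product":
                offer = item.get("offers") or {}
                rows.append({
                    "title": item.get("name", ""),
                    "price": offer.get("price", ""),
                    "sku": item.get("sku", ""),
                    "description": item.get("description", ""),
                    "availability": offer.get("availability", ""),
                    "image_url": item.get("image", ""),
                })
    return rows
```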
I need a clean, well-structured extract of permit holder information from the WA State Labor & Industry online permit lookup (sometimes called the Permit Center). Whether you can do a fully automated scrape or need to do a manual pull is up to you—the key is accuracy and complete coverage. Scope • Visit the WA State L&I electrical permit lookup site and capture every record that appears in the public search results that: - Is for a generator or automatic transfer switch installation. - For the license numbers that will be given to you - for the timeframe given (5-6 years back). • Extract only the permit holder–related fields (name, address, and any other holder-specific details that the site exposes). • Retu...
...are mandatory. Tech freedom: I haven't settled on Claude, OpenAI, or any other foundation model yet. I'm open to your guidance on which stack best balances accuracy, cost, and future scalability, provided everything runs on a maintainable cloud or container platform. Core deliverables • Architecture and tooling proposal with rationale • Data-ingestion pipelines that screen-scrape at scale and connect to our databases • Machine-learning layer that turns raw inputs into trends, forecasts, and anomaly flags • Automated report generator (dashboards, PDFs, or both) running on a schedule • Deployment scripts and hand-off documentation Acceptance criteria 1. Daily analytics complete in under 30 minutes without manual int...
I need a developer to collect data from multiple public websites and deliver it in a clean, structured format. This is for legitimate data extraction from publicly available pages. I will share the target URLs and exact data fields with shortlisted candidates. Scope of work Scrape data from multiple public websites (details shared after shortlisting) Extract specific fields consistently and handle pagination/filtering where needed Normalize/clean the data (remove duplicates, consistent formatting) Export results to CSV/Excel/JSON (format to be confirmed) Provide a repeatable solution (script or small app) that I can run on demand Basic documentation: how to run it, how to adjust settings, where outputs go Quality requirements Reliable scraping with error handling and retrie...
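The normalize/dedupe step requested above can be sketched with nothing beyond the standard library; the field names (e.g. `email`) are illustrative, since the actual fields are shared only after shortlisting:

```python
def normalize(row: dict) -> dict:
    """Trim whitespace everywhere; lower-case emails for stable matching."""
    out = {k: (v or "").strip() for k, v in row.items()}
    if "email" in out:
        out["email"] = out["email"].lower()
    return out

def dedupe(rows, key_fields):
    """Keep the first occurrence of each key-field combination."""
    seen, unique = set(), []
    for row in map(normalize, rows):
        key = tuple(row.get(f, "") for f in key_fields)
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique
```

Normalizing before building the dedupe key matters: otherwise case and whitespace variants of the same contact slip through as distinct records.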
... • Push a working version to my Apify account with clear release notes describing every code change. Acceptance A single test run on 50 product URLs must complete with 100 % fill-rate on those three fields, no unhandled errors in the log, and an average page processing time in line with current Apify best practices. If you have recent experience fixing DOM-breaks in live Apify actors that scrape large sites such as eBay, this will be a straightforward task. Let’s get the scraper fully functional again....
I have an urgent need for a clean, well-structured dataset containing the listing agent’s first name, last name, mailing address, and phone number for well over 500 active Zillow listings. Speed is critical, but accuracy matters just as much; the final file should be ready for immediate import into my CRM. You are free to use whichever stack you prefer—Python with BeautifulSoup or Scrapy, Selenium, residential proxies, even the unofficial Zillow API—so long as rate-limits are respected and the data is complete. I don’t need property details or price history; the focus is strictly on the agent contact fields. Deliverables • CSV or XLSX with a separate column for each required field • A short read-me explaining the script or method so I can rerun it la...
I’m running FVWM3 on Linux and use the Perl-based FvwmTabs module. To improve the workflow, I need the following improvements: • Fix FvwmTabs' ability to preserve its state (i.e. remember the windows it has swallowed as tabs) between fvwm restarts, creating the temporary file specified in the FvwmTabs file (line 188): stateFile => $ENV{FVWM_USERDIR} . '/.', • Click-to-focus tabs: switch the current focusFollowsMouse behaviour to an explicit click-to-focus model. • Externalised configuration: move the options defined between lines 165–201 (starting with `my $configTracker = $fvwm->track('ModuleConfig', ...`) into a separate, editable file while preserving full functionality. • Fix menu options: make tab options tags in the menu ...
...built: a TradingView script to create a push notification. The notification is to be generated at the user-defined time/times; at that time, the selected ticker should be compared with a previous close (the timeframe of that close is to be user-selected). The notification should state the ticker's status at the time of notification relative to that previous close, in direction, value, and percentage, i.e. "Ticker X is +10 / 3%". The notification should be a pop-up in the TradingView app...
... Here are some FAQs for this project: 1. Should PII (ie names, signature, address, headshot photo, etc) be redacted? If yes, what is the required format? - Yes, this should be redacted or otherwise hidden. No PII may be visible on delivered assets – participants may cover PII or supplier may redact after collection. Barcode and merchant name should be visible. - Have participant cover the PII - needs to be obfuscated, no annotation or redaction format required 2. Some stores (ie Costco) use credit cards that double as their membership ID, is this acceptable? - This is acceptable if their card number and any PII is removed. 3. Are day passes / tickets with
I'm seeking an automation expert to build several marketing and content posting workflows ...automated for each account. Ideal skills and experience: - Proficiency in n8n, sumopod, Canva, Generative AI APIs - Experience with AI agents in automation - Strong understanding of social media automations - Background in setting up automation workflows for content posting and marketing campaigns Looking forward to your proposals! This will be a set price for each automation you set up (i.e. 1 TikTok account = $xx). We have 6 accounts to set up immediately, followed by several other projects immediately following the completion of this first project. YOU WILL BE ASKED TO DEMONSTRATE YOUR SKILLS. PLEASE DO NOT BID IF YOU DO NOT HAVE EXPERIENCE IN FULLY-AUTOMATED SOCIAL MEDIA...
Hi Tanmay, I want to update a few things on this website. 1. The logo: it has now been trademarked, so it's a small change with the registration symbol added. 2. The Google Reviews are not linked; the home page and Testimonials page are displaying different numbers of reviews, i.e. the Testimonials page is not updating. 3. Need to update the email header graphic with the new logo. Will add login details in chat.
...Building) Develop a high-quality backlink strategy focusing on science, biohacking, and health-related publications. Note: No "link farms" or spammy guest posts. We need niche-relevant authority to survive algorithm updates. 3. GEO (Generative Engine Optimization / AI Visibility) Optimize our content for Retrieval-Augmented Generation (RAG)—ensuring AI bots (ChatGPT, Gemini, Perplexity) can easily scrape and cite our site. Focus on Entity Health: Improve our brand's "citations" across the web so AI models recognize us as a reputable entity in the peptide space. Format FAQ sections to answer conversational "long-tail" AI queries (e.g., "What is the shelf life of lyophilized peptides?"). Required Experience Proven track reco...
I'm looking for a freelancer to scrape data from Washington state electrical permit applications, to be used for marketing. Requirements: - Details important - full name and address must be collected, with permit # and date - List to be built so that a data merge can be incorporated for printing labels. Ideal skills: - familiar with Microsoft Office. Examples of input screen and data screen attached. All required licenses and dates will be provided.
I have already built...story. Think of it as taking a nearly-finished puzzle and snapping the last pieces into place. If we work together, here’s what I’d like to walk away with: • A clean, intuitive layout on every page, with headings and sections arranged for easy reading. • Consistent positioning of text blocks and images so the site looks polished rather than patched together. • Ensure all links and landing pages(ie checkout pages) are functioning correctly • Clear guidance (or hands-on edits) for any remaining content I should add before launch. I can give you full access to the current build, explain the goals of each page, and respond quickly to your questions or feedback. Once the layout feels solid, I’ll handle any final conte...
I need a small, reliable script that pings the Late Show with Stephen Colbert page on 1iota every 30 seconds and fires off an SMS the moment May 21 tickets appear. The job is straightforward but time-sensitive: • Scrape or query the specific event listing without triggering 1iota’s bot protections (Python with requests/BeautifulSoup, Playwright, or Selenium are all fine—use what keeps the check time low). • Parse the response and confirm that the date equals 21 May before treating it as a positive match. • Send a single, immediate SMS alert to my phone via Twilio (or another SMS gateway you’re comfortable with). The script must run unattended on a Mac or Linux box—so include setup instructions, any required environment variables, and ...
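The check-then-alert loop described above separates cleanly into a date-match test and a poller; `fetch` and `send_sms` are injected callables, so the same logic works whether the page grab uses requests or Playwright, and whether the SMS goes through Twilio (whose Python helper exposes `Client(...).messages.create(...)`) or another gateway. The "sold out" guard is an assumption about the page wording:

```python
import re
import time

TARGET_DATE = re.compile(r"May\s+21")  # the date to watch for

def tickets_available(page_text: str) -> bool:
    """Positive match only when the target date appears and the page is
    not marked sold out (the 'sold out' wording is an assumption)."""
    return bool(TARGET_DATE.search(page_text)) and "sold out" not in page_text.lower()

def alert_once(fetch, send_sms, interval=30):
    """Poll via fetch() every `interval` seconds; send one SMS, then stop.
    Sending exactly once avoids a flood of texts while tickets stay live."""
    while True:
        if tickets_available(fetch()):
            send_sms("1iota: May 21 tickets just appeared!")
            return
        time.sleep(interval)
```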
...use for integrations), let’s work together. Once connected, I’ll handle the filtering logic inside , but I need your account to serve as the data source and, ideally, your guidance to be sure the pull limits stay within LinkedIn’s acceptable use. When you reply, focus on your experience—how you’ve successfully linked Sales Navigator with automation platforms before, any anti-scrape precautions you follow, and typical daily search volumes you’ve handled without issues. If you can demonstrate that your account is stable and won’t be at risk of restriction while we’re running, you’ll move to the top of the list. Deliverables (brief and concrete): • Verified connection of your Sales Navigator account to my scenario. &b...
...items). Key Requirements List for Your Records: Speed: Under 2-second load time. Integration: Connection to shipping carriers (e.g., DPD, An Post). Trust: Professional "About Us," "Privacy Policy," and "Terms of Service" pages. Scalability: The ability to add new brands and categories easily without breaking the site. Pro-Tip for your Freelancer Post: Since your references are mostly Irish (.ie) sites, tell the developer they must ensure the site is GDPR compliant and handles Irish VAT (23%) correctly at checkout....
...reference a **LabVIEW While Loop** - The loop must be an **integrated structural element**, not decoration ## Reference Design (Required) An image created by us is attached and serves as the **design baseline**. The loop around the outside of the image must exist in the final version, as this matches the text within that says "Stay in the loop!" ## Color Palette (Required) Windows 11 theme colors. ie. think blue/greys/white/black ## Tone & Feel (Very Important) The final image must feel: - Fun - Welcoming - Community-focused …but **NOT**: - Loud - Noisy - Flashy - Sales-driven Think **calm confidence**, not hype. Avoid visual clutter, excessive effects, or aggressive calls to action. ## Text Content The following text must appear as written in the att...
I need to scrape a website with public content and export it to an organized Excel file. It's approximately 700k pages with specific data. Some of the data I need is missing, but I have an Excel file with this data that should be used to autocomplete the missing information. In summary: 1. Scrape the website to an Excel file (I will give an example) 2. Autocomplete the missing information based on my Excel file. After this project, I will need another one, which will require finding contact information with some precision, perhaps using AI or some specific logic, but that will be a topic for later. Thank you all
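Step 2 above (autocompleting missing fields from the client's Excel file) is essentially a keyed left-join in which scraped values win when present. A minimal stdlib sketch, with `key` standing in for whatever shared ID column the two files turn out to have:

```python
def autocomplete(scraped: list, reference: dict, key: str) -> list:
    """Fill blank fields in scraped rows from a reference lookup.

    `reference` maps the shared key column to a row of known-good values
    (e.g. built from the client's Excel file); only empty scraped fields
    are filled, so freshly scraped values always take precedence.
    """
    filled = []
    for row in scraped:
        fallback = reference.get(row.get(key, ""), {})
        fields = set(row) | set(fallback)
        filled.append({f: row.get(f) or fallback.get(f, "") for f in fields})
    return filled
```

At 700k pages the same merge would more likely run through pandas or a database join, but the precedence rule (scraped first, reference as fallback) stays identical.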
...fields, so feel free to leave those out if they’re present. This is not a fresh scrape request; I only want data that you currently have on hand. Anything compiled within roughly the last twelve months is perfect, but older archives could still be useful if they’re large and well-structured. I’m flexible on format: CSV, Excel, or JSON all work. Just let me know which one your files are in, when the data was pulled, and roughly how many records there are. To keep the process smooth, please provide a short sample (about 50–100 rows) so I can verify the structure and content before we finalise. Deliverables • Full file(s) containing only the contact information fields • Brief README or note outlining scrape date, number of records, and f...
Squarespace designer/developer to recreate an existing Squarespace website based entirely on publicly available pages and content. We do not have access to the original Squarespace account or source files. The rebuild will be done using visual reference, captured public content, and brand assets we already own, inside a new Squarespace account. This is a straightforward reconstruction project, not a scrape or migration. Scope of Work Rebuild the full website in Squarespace (page-by-page) Match layout, typography, spacing, and responsive behavior Recreate navigation, footer, and global styles Rebuild CMS collections (e.g., blog, case studies, podcast, or similar) Apply light custom CSS if needed for fidelity Ensure mobile, tablet, and desktop responsiveness QA and polish prior to ...
...to page views to clicks to conversions to revenue, combining GA4, BigCommerce and Netsuite data Future Plans (For Context) You don't need to build this yet, but the architecture should be ready for it later: Marketing data (FB/Google Ads) - capability to start building our own attribution and analytics tools Email data (Dot Digital) - to combine email marketing and referral metrics with outcomes ie BigCommerce and Netsuite Competitor pricing scrapes - the ability to match competitor price scrapes to our own product data and eventually set up some automatic price rules. This could include other types of external data in excel, sheets, csv, etc What You'll Hand Over By the end of the project (aiming for ~8 weeks), we expect: The Infrastructure: GCP project set up wit...
I need a fresh supply of consumer leads from within the United Kingdom, and the key requirement is that every record has been collected through live telephone interaction—either a telesurvey or any other voice-based activity. Online-only sources, web scrape data or legacy databases will not be accepted. Please supply a clean, deduplicated file (CSV or XLSX) containing the following headers exactly as written: • Title • First Name • Last Name • Address Line 1 • Address Line 2 • Address Line 3 • Town • County • Phone Number (mobile preferred) • Age • Opt-in Date • Opt-in Source Acceptance criteria • All numbers must be active and connected to UK networks. • Opt-ins must be within...
...customer-facing role. The work consists of: * Small, clearly scoped Python scripts * Web scraping (HTML, PDFs, APIs) * Data cleaning and transformation * ETL-style utilities All work is: * Async-first * Internal tools only * Clearly scoped with written requirements This is **ongoing contract work**. Strong performers may receive long-term work. --- ### What You’ll Be Doing * Build Python scripts to scrape public websites * Parse HTML, JSON, CSV, and PDF files * Clean and normalize messy real-world data * Write clear, maintainable utility scripts * Deliver working code (not just prototypes) --- ### Required Skills * Strong Python fundamentals * Real experience with web scraping * Data parsing and data cleaning * Comfortable working independently and async --- ##...
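The "clean and normalize messy real-world data" bullet above usually starts with entity-unescaping and whitespace collapsing; a small reusable utility of the kind this role describes (a sketch, not taken from the posting):

```python
import html
import re

def clean_text(value: str) -> str:
    """Unescape HTML entities, collapse whitespace runs, strip the ends."""
    return re.sub(r"\s+", " ", html.unescape(value or "")).strip()

def clean_record(record: dict) -> dict:
    """Apply clean_text to every string field of a scraped record,
    leaving non-string values (counts, floats) untouched."""
    return {k: clean_text(v) if isinstance(v, str) else v
            for k, v in record.items()}
```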
...unused default ListingPro fields (restaurants, events, etc.) SEARCH AND FILTERS - Configure filters for city, state, course, ownership, and institution type - Ensure filters are optimized for large datasets (50,000+ listings) - Avoid heavy meta queries and performance bottlenecks LEGAL AND COMPLIANCE REQUIREMENTS (MANDATORY) - Add a global website disclaimer - Add a per-listing disclaimer - Do not scrape data, logos, or images from other websites - Disable logo uploads by default users - Allow image uploads only after admin approval - Implement a proper “Claim this listing” workflow with email verification and admin approval CLAIM LISTING FUNCTIONALITY - Institution can request to claim a listing - Email verification is required - Admin approval is mandatory - Clai...
...same every day) Your job is to scrape ALL matching businesses in that city across those industries, at scale, and deliver clean, call-ready datasets for our sales team. Each record must include the following: - Business Name - Industry - Address - City - State - ZIP Code - Website URL (if available) - Phone Number - Business Owner (if possible) Accuracy matters — this data is used for outbound calling. Volume Expectations - Thousands of businesses per day - Multiple industries per city - Must scale reliably without manual scraping - Ability to repeat this process daily for new cities This is not a one-time scrape — this is an ongoing system. Responsibilities - Use advanced AI tools, automation, and scraping frameworks to collect business data at scale - S...
I want to scrape specific data from the attached Excel file.
...Python (requests / BeautifulSoup or Selenium) or Scrapy workflow is fine as long as the final output arrives in a single Excel file (.xlsx) that opens error-free in Microsoft Excel. Accuracy matters more than speed—random spot checks will be run. Any duplicates, blanks, or malformed addresses will be sent back for correction. Once the first 5,000 pass review, I’ll green-light the rest of the scrape so we can wrap the project quickly....
Description: - We are looking for an experienced Data Scraping / Web Scraping expert. - We will share the industry name, and the freelancer should: - Suggest suitable websites/sources to scrape - Suggest countries/regions that can be covered - Share estimated data volume & approach - After approval, the freelancer will scrape and deliver clean, structured data. Data Required (example): - Company name - Location - Contact details (email/phone/website – if available) Requirements: - Proven experience in data scraping - Knowledge of Python, Scrapy, Selenium, APIs, etc. - Ability to scrape multi-country data (based on feasibility) Deliverables: - Data in Excel / CSV / Google Sheets - Basic info of sources used To Apply, share: - Similar scraping work - Tools...