Data scraping from goldenpages.ie: projects
Hello, I am looking for a software developer for a web-scraping project. Would you be interested in collaborating? Thanks, Dumitru
Collecting data from various websites and displaying it on certain pages of our own.
I need a Tampermonkey script that will do the following: Please ensure you watch the video in full before replying to me. This task is urgent; I expect it to be completed within 2-3 hours. If you are confident you are up to the task, let me know and I will authorise a project for you. Finished Product Requirements: - Do exactly as explained in the video - Ability to select invoices using a tick box and click "Delete" - No additional command or input required from the user - Be written on an approved platform (i.e. Tampermonkey) - Work 100% Please ensure you understand these requirements before commencing work.
I own an OLX account with over 1,200 posted listings. I would like to import these products into a web page: the page should contain all the products posted in the OLX account, but allow customers to order through the site.
Hi! I am interested in data entry for a site that uses web scraping. I saw that you are skilled in this area. How can we discuss in more detail?
The script's objective is to scrape any website and list all the cookies it sets. Before scraping begins, we specify how many pages the site has; if the script finds more than that number, it stops. For each cookie found, it must record: - the cookie's name - the site that set it - its type (HTML...) - how long it lasts - the full URL where it was found - its description - where the data is sent The data is stored in a database.
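A minimal Python sketch of such a cookie-listing crawler, assuming a `?page=N` pagination scheme and a SQLite store; the base URL and page limit are placeholders, and fields like type, description, and "where the data is sent" normally come from a manually curated cookie database rather than from scraping itself:

```python
# Sketch: record the cookies each page of a site sets, stopping if the
# site has more pages than the configured limit. URL pattern is assumed.
import sqlite3
import requests

MAX_PAGES = 50  # stop if the declared page count exceeds this limit

def cookie_rows(session, url):
    """One row per cookie currently in the session's jar."""
    return [
        {
            "name": c.name,
            "domain": c.domain,    # site that set it
            "expires": c.expires,  # lifetime (None = session cookie)
            "found_on": url,       # full URL where it was observed
        }
        for c in session.cookies
    ]

def crawl(base_url, page_count, db_path="cookies.db"):
    if page_count > MAX_PAGES:
        raise ValueError("site reports more pages than the configured limit")
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS cookies (name TEXT, domain TEXT, expires TEXT, found_on TEXT)"
    )
    session = requests.Session()
    for page in range(1, page_count + 1):
        url = f"{base_url}?page={page}"  # assumed pagination scheme
        session.get(url, timeout=10)
        for row in cookie_rows(session, url):
            conn.execute(
                "INSERT INTO cookies VALUES (?, ?, ?, ?)",
                (row["name"], row["domain"], row["expires"], row["found_on"]),
            )
    conn.commit()
    conn.close()

# Example (performs network requests):
# crawl("https://example.com/products", page_count=3)
```

Note that `requests` only sees HTTP cookies; cookies set by JavaScript would need a headless browser instead.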
We have several sites with products identical to ours and want to make our work more efficient.
Hi, I noticed on your profile that you also do web scraping. I have such a project, involving a large amount of data (56,393 rows, to be exact) that is available in several languages on a website. For each available product I need a specific set of data. I need a solution that I can run myself. If your time allows, please don't hesitate to contact me and we can discuss any other details you need. Thanks, Andrei
We need someone to scrape/crawl a site of roughly 10,000 products and then load the data into a Magento store. The imported data must include SKU, name, price / special price, dimensions, colours, description, and images. The work should be done through the Magento API, so that some of the data (sizes, price, colours) can be updated periodically and automatically (for example, every day or every week). I cannot give everyone the URL of the site to be scraped; only the selected freelancer will receive it. Required skills: PHP, HTML, XML, JSON, APIs
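The periodic-update half of this brief can be sketched against the Magento 2 REST API (admin token, then `PUT /rest/V1/products/{sku}`). The store URL, credentials, and attribute codes below are placeholders, and the brief mentions PHP, so treat this Python version as an illustration of the API flow only:

```python
# Sketch: push periodic price/colour updates to Magento 2 via its REST API.
import requests

def admin_token(base_url, username, password):
    """Fetch an admin bearer token (POST /rest/V1/integration/admin/token)."""
    r = requests.post(
        f"{base_url}/rest/V1/integration/admin/token",
        json={"username": username, "password": password},
        timeout=10,
    )
    r.raise_for_status()
    return r.json()

def product_payload(price, colour=None):
    """Build the update body; only fields being refreshed are included."""
    attrs = []
    if colour is not None:
        attrs.append({"attribute_code": "color", "value": colour})
    return {"product": {"price": price, "custom_attributes": attrs}}

def update_product(base_url, token, sku, payload):
    r = requests.put(
        f"{base_url}/rest/V1/products/{sku}",
        json=payload,
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    r.raise_for_status()
    return r.json()

# Example (requires a real store):
# token = admin_token("https://shop.example", "apiuser", "secret")
# update_product("https://shop.example", token, "SKU-123", product_payload(19.99, "red"))
```

Scheduling the run daily or weekly would then be a cron job around this script.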
I need someone who has done web scraping before for an auto-parts website.
Hi, I need a programmer on a long-term basis, either on subscription or per project. The work involves processing e-commerce feeds, APIs, scraping, optimisation, etc. Your profile matches what I am looking for. The allocated budget and execution time are hard to define; we would need a task-by-task discussion. If you are interested, please reply. Many thanks.
New project on Osclass: it requires module adaptations in the theme, script development, data mining/scraping, card-payment implementation, and more.
I am looking for someone who makes black-and-white fashion-style illustrations. Awaiting offers and portfolios. Thank you.
I am not necessarily looking for certified translators, but I do want proof that you translate well. I am interested in translating the website (the important parts of it; 100% is not needed, for example the contacts page does not need to be translated). I am interested in the lowest possible price. You may also use Google Translate to help you.
I am looking for someone to build me a WordPress site: specifically, a theme with certain plugins installed. The site must: - be responsive - let me make changes to the pages later - be cross-browser compatible (I don't care about old IE versions, but Safari matters to me in particular) Note: not all pages have the same design!
I am looking for someone who knows how to use scraping on the OpenCart platform and can help me implement and use it correctly. Note that my online store is already up on OpenCart and I have bought the MultiScraper Pro module, which is installed; I just don't really know how to use it. Thank you.
...integrations via the API consoles of Facebook, YouTube, Twitter, and Pinterest - building a site responsive for phone and tablet - clearly delimiting the work areas on a custom map of the work locations: a single map (embed fragment: ,26.114845&spn=0.036751,0.051498&z=13&source=embed&dg=feature) - on-page SEO: ranking these keywords in the top 3: cursuri de dans bucuresti; cursuri de dans pentru copii; scoala de dans copii; cursuri de dans pentru nunta; valsul mirilor; dansul mirilor; dansatori profesionisti pentru evenimente; cursuri de balet; cursuri balet pentru copii; scoala de dans bucuresti - a more prominent link to the sign-up page - short loading time - optimising H1, H2, meta description, and meta title for every page of the site - subpages for selecting dancer shows - subpages for wedding-dance courses - respecting the site's current theme 100% - respecting the graphics
...outbound marketing systems and client operations. In this role, you will assist with outbound tools such as , , Heyreach, Apollo, and scraping tools, while also helping maintain trackers, organizing data, and supporting clients with operational and tool-related questions. You must have excellent written English and strong hands-on experience with Instantly.ai. This is a long-term position with consistent work. ⸻ Required Tools Experience (Mandatory) You MUST have hands-on experience with: • (very important) • • • • LinkedIn outbound workflows • Apify scrapers or similar scraping tools Applications without experience will not be considered. ⸻ Core Responsibilities 1. Campaign Operations • Monitor
I will perform a clean, accurate, one-time web scraping and structured data extraction of the public website table and deliver the complete dataset into a Google Spreadsheet with exact column order, row integrity, and zero data loss. Using a reliable scraping stack (Python, Requests, BeautifulSoup, or JavaScript if required), I will extract all visible table rows, normalize and validate the data, and transfer it into Google Sheets with proper formatting, structured schema alignment, and clean tabular organization. Key deliverables include: • Full table scraping from public website source • Accurate data extraction with 100% row and column preservation • Clean Google Sheets integration with correct structure and formatting ...
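The table-to-Sheets flow described above can be sketched with Requests, BeautifulSoup, and `gspread`. The target URL and spreadsheet name are placeholders, and `gspread.service_account()` assumes a service-account credentials file is already configured:

```python
# Sketch: one-off scrape of a public HTML table into Google Sheets.
import requests
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

def table_to_rows(html):
    """Extract the first <table> as a list of rows (header row included),
    preserving the on-page row and column order."""
    soup = BeautifulSoup(html, "html.parser")
    table = soup.find("table")
    rows = []
    for tr in table.find_all("tr"):
        cells = [td.get_text(strip=True) for td in tr.find_all(["th", "td"])]
        if cells:
            rows.append(cells)
    return rows

def push_to_sheets(rows, spreadsheet_name):
    import gspread  # third-party: pip install gspread
    gc = gspread.service_account()       # reads service-account credentials
    ws = gc.open(spreadsheet_name).sheet1
    ws.update(rows)                      # writes starting at A1

# Example (network + credentials required):
# html = requests.get("https://example.com/data", timeout=10).text
# push_to_sheets(table_to_rows(html), "Scraped dataset")
```

Spot-checking row counts afterwards is then just `len(rows)` against the page.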
I already have a fully working bot whose main task is monitoring the offers available on OLX and automatically finding those that meet specified criteria. I want to register this application on the Polish version of the OLX developer portal, but my first application was rejected with the suggestion of "Data Scraping/Competitor Monitoring". I therefore need a fresh, OLX-policy-compliant description that: • meets all of the portal's terms-of-service requirements and passes the verification process without objections. I expect a short, clear text (PL + optionally ENG), ready to paste into the registration form, and, if possible, guidance on the rest of the application process. What matters to me is ...
Web scraping of a job portal (including detail pages). Main website: [Fachangestellte/r (MFA)&job_location&radius=30&employer_type=['praxen','mvz']/] Detail page (first result). Fields that I need in the Excel: - Title of each entry (job title) - Name of the company - Date - Address - Detail page: I need the full job description in one field (can be HTML) This is about 1k results. What is the price and what would be the delivery time?
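A sketch of the listing-plus-detail-page structure for this kind of job, written to Excel via pandas. All CSS selectors (`.job-title`, `.company`, `.job-description`, etc.) are placeholder assumptions and would need to match the portal's real markup:

```python
# Sketch: collect job listings plus each detail page's full description
# (kept as raw HTML in one field) into an Excel file.
import requests
from bs4 import BeautifulSoup  # third-party

FIELDS = ["title", "company", "date", "address", "description_html"]

def parse_listing(card_soup):
    """Pull the summary fields from one result card (selectors assumed)."""
    return {
        "title": card_soup.select_one(".job-title").get_text(strip=True),
        "company": card_soup.select_one(".company").get_text(strip=True),
        "date": card_soup.select_one(".date").get_text(strip=True),
        "address": card_soup.select_one(".address").get_text(strip=True),
    }

def fetch_description(detail_url, session):
    """The full job description stays as one HTML string, as requested."""
    soup = BeautifulSoup(session.get(detail_url, timeout=10).text, "html.parser")
    node = soup.select_one(".job-description")
    return str(node) if node else ""

def save_excel(records, path="jobs.xlsx"):
    import pandas as pd  # third-party; needs openpyxl for .xlsx output
    pd.DataFrame(records, columns=FIELDS).to_excel(path, index=False)
```

For roughly 1k results this runs comfortably in one pass with a polite delay between detail-page requests.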
...Python Developer for US Data Pipeline and iOS Verification System (Phase 1) Project Description Suggestion: Overview: > We are looking for a senior Python developer to build an automated data scraping and iOS verification pipeline based in the US. The goal for Phase 1 is to acquire over 10,000 verified leads per day. Core Tasks: 1. Data Scraping: Extract data (name, phone number, age, gender, carrier) from US people search websites. 2. Anti-detection: Must integrate the API and set render=true and super=true. 3. Data Filtering: Implement automatic filtering by wireless/phone number and age range (50-90 years old). 4. Data Verification: Integrate the LoopLookup API to verify iMessage activation status. 5. Data Exp...
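Of the pipeline stages listed above, the data-filtering step (task 3) is concrete enough to sketch. The record layout and the `line_type` field name are assumptions; the scraping-API parameters and LoopLookup integration are left out since their interfaces are not specified here:

```python
# Sketch of the filtering step only: keep wireless numbers, ages 50-90.
def keep_lead(lead):
    """Filter rule from the brief: wireless line type and age 50-90."""
    age_ok = 50 <= lead.get("age", 0) <= 90
    wireless = lead.get("line_type", "").lower() == "wireless"
    return age_ok and wireless

def filter_leads(leads):
    return [lead for lead in leads if keep_lead(lead)]
```

At 10,000+ leads per day this filter is trivially fast; the bottleneck would be the scraping and verification API calls around it.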
Hey! I’m looking to hire an experienced developer to build a universal product-detail scraping pipeline that takes a product URL (any website) and returns a complete structured product record. This is not a “simple HTML parse.” Many target sites are React/Next/Vue, load content via XHR/GraphQL, hide details behind tabs/accordions/modals, and lazy-load images/PDFs. The solution needs to reliably extract everything a human can see on the page, plus the underlying data used to render it. What the scraper must do (high level) Given a product URL, the pipeline should: Load the page like a real user (handle cookies/overlays). Capture all content from multiple sources (DOM + network + interactions). Use GPT API strategically to increase accuracy (field mappin...
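The "DOM + network + interactions" capture described above can be sketched with Playwright: render the page, record JSON/XHR responses as they arrive, click open tabs and accordions, then read the final DOM. The selectors for toggles are placeholder guesses, and the GPT field-mapping step is omitted:

```python
# Sketch: load a product page like a real user, capturing both the
# rendered DOM and the JSON responses used to build it.
def scrape_product(url):
    from playwright.sync_api import sync_playwright  # pip install playwright
    captured = []  # JSON bodies from XHR/GraphQL calls

    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()

        def on_response(resp):
            if "application/json" in resp.headers.get("content-type", ""):
                try:
                    captured.append(resp.json())
                except Exception:
                    pass  # non-decodable bodies are ignored

        page.on("response", on_response)
        page.goto(url, wait_until="networkidle")
        # Expand hidden tab/accordion content before reading the DOM
        for toggle in page.query_selector_all("[role=tab], details > summary"):
            try:
                toggle.click()
            except Exception:
                pass  # overlapped or detached elements are skipped
        html = page.content()
        browser.close()
    return html, captured

def merge_records(dom_fields, network_fields):
    """Network (API) data wins on conflicts: it is the raw source the
    page rendered from, so it is usually cleaner than scraped text."""
    merged = dict(dom_fields)
    merged.update(network_fields)
    return merged
```

Cookie banners and overlays would additionally need a dismiss step before the toggle loop.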
...workflow. This is about extracting, comparing, and interpreting data. Excel and PowerPoint remain the source of truth. What we need: - Compare PowerPoint vs Excel and flag mismatches - Explain underwriting models and trace outputs - Compare legal/term sheets vs financial assumptions - Track document versions and changes - Summarize deal folders Automation goals: - Draft IC and board materials from templates - Standardize presentations and memos - Replace recurring analyst work - Produce management summaries - Highlight anomalies and trends Market intelligence: - Build comp sets - Pull pricing and availability - Map assets and demand drivers - Provide macro context Mandatory data inputs: Internal databases and Excel, Dropbox, web scraping. Open to Claude, OpenAI, or...
...financial records from a set of business websites and turn them into a clean, structured dataset that my team can work with immediately. The job calls for a blend of precise web scraping and careful data entry so every figure—revenue, expenses, balance-sheet items, year-on-year comparisons—lands in the correct column and remains faithful to the source. Here’s what the work looks like from my side: • You’ll navigate each designated site, locate the target financial tables or statements, and pull every required number. • For transparency, I also want the source URL and the date you captured each record logged beside the data. • Consistency matters: please apply uniform naming conventions (e.g., “FY2023 Gross Profit”...
I need a clean, reliable web-scraping script built either in Node.js or Python. The goal is simple: pull fresh data every day and make it immediately available for display on my website. If you have questions about target sites, anti-bot measures, or preferred hosting, let me know and we’ll refine before you start.
Task: Extract emails from a list of 65,158 websites. The client will give a list of URLs; this list has 65,158 URLs. The freelancer needs to write a script to fetch emails from those URLs using automated web-scraping techniques. Delivery time: 5 days. Delivery format: data in Excel.
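The per-URL email extraction can be sketched as below. The regex is a pragmatic approximation rather than a full RFC 5322 validator, and the output path is a placeholder; at 65k URLs a real run would also want concurrency and resumability:

```python
# Sketch: pull email addresses out of a list of pages, one row per URL.
import re
import requests

EMAIL_RE = re.compile(r"[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}")

def extract_emails(html):
    """Unique emails in the page source, first-seen order preserved."""
    return list(dict.fromkeys(EMAIL_RE.findall(html)))

def scrape_urls(urls):
    rows = []
    for url in urls:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            rows.append({"url": url, "emails": "", "status": "error"})
            continue  # a failed site must not stop the whole run
        rows.append({"url": url, "emails": ";".join(extract_emails(html)), "status": "ok"})
    return rows

def to_excel(rows, path="emails.xlsx"):
    import pandas as pd  # third-party; needs openpyxl for .xlsx output
    pd.DataFrame(rows).to_excel(path, index=False)
```

Checking contact/about pages in addition to the homepage typically raises the hit rate substantially.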
I need a reliable solution that can pull data from LinkedIn and insert it straight into a database I specify. The core requirement is the automated transfer—once the tool finishes scraping, every captured field should already be sitting in the database ready for queries and reporting, no manual copy-paste. You’ll advise me on the best approach to authenticate, respect rate limits, and minimise the risk of blocks while still collecting the typical profile-level details (name, headline, company, location, experience, education, skills and anything else you can legally obtain). I will confirm the final field list before you begin. Key objectives • Build or configure a scraper / API wrapper that logs in, navigates to each target profile and captures the agreed-...
...an experienced AutoHotkey (AHK) developer to build a clean, reliable script that automates the repetitive navigation and clicking I perform every day inside my web application. Here’s the core scenario: the macro will launch a browser tab, step through a predictable series of pages, click specific buttons or links, wait for elements to load, and continue until the end of the workflow—no data scraping or form filling is required, just fast, accurate page-to-page movement and element selection. I’ll provide: • A screen-recording that shows the exact click path and timing cues • XPaths, CSS selectors, or unique element IDs where available • Any login credentials needed for testing (in a secure manner) You’ll deliver: • ...
I have a list of real-estate agencies operating in Melbourne, Victoria and need every staff member’s direct phone number captured—agents, managers, administrative staff, everyone on the roster. You’ll work through th...number • One row per staff member; separate rows even if people share the same office line • A brief note/log for any agency where no direct numbers could be sourced despite reasonable effort Acceptance criteria • At least 90 % of listed staff have a direct number • Random sample of 20 entries must ring through to the named individual or their personal voicemail If you’re used to data-scraping tools but comfortable jumping on the phone to fill gaps, this should be quick work. Let me know your timefram...
I need a clean one-off scrape of tabular data that sits openly on a public website and have that entire dataset placed into a Google Spreadsheet. Because it is only a single extraction, I am not looking for a recurring script or scheduler—just an accurate pull of everything that appears in the table on the page today. Feel free to use your preferred stack—Python with BeautifulSoup/Requests, Apps Script, or any reliable web-scraping tool—as long as the final result lands neatly in the sheet, keeping the same column order and row count that appears online. Before we wrap up, I’ll quickly check row totals and a handful of random cells against the site to confirm accuracy; once those spot checks pass, the job is done.
I have a single public website that lists companies and I need their basic contact details pulled immediately. As soon as we agree, I’ll send you the URL; from there I expect yo...for each result—company name, address, website, phone number, and email—nothing more. The final deliverable is a clean, well-structured Excel file ready for me to review. Speed is the priority here: please be able to start right away and turn the file around as fast as possible while still double-checking that every row is accurate and complete. If this timeline works for you and you have solid scraping experience with tools like Python, BeautifulSoup, or Scrapy, let’s move forward now. The budget is small, as this is a simple task, so low-budget bidders get first priority. But start now. Simple task. Star...
I have two source spreadsheets that I need merged and enriched through automated scraping: • “File 1” – 170 k Spanish local businesses with emails • “File 2” – 65 k additional businesses with websites only Phase 1 – Email extraction Using a Python script and well-known libraries (requests, BeautifulSoup, Scrapy or similar), scan every site listed in File 2, capture all working email addresses you can locate, then append them to the corresponding rows so I can produce a unified “File 3”. Phase 2 – Offer harvesting Next, visit each live site in File 3. Where an offer, deal or promotion is publicly displayed, record the details in a fresh Excel sheet with these exact columns: Business ID | Business Name ...
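The Phase 1 merge step ("append scraped emails to the File 2 rows to produce File 3") can be sketched with pandas. The column names `website` and `email` are assumptions about the spreadsheets' layout:

```python
# Sketch: enrich File 2 with scraped emails to produce File 3.
import pandas as pd  # third-party

def build_file3(file2, scraped):
    """scraped maps website URL -> ';'-joined emails found on that site.
    Rows whose site yielded no email get an empty string, not NaN."""
    out = file2.copy()
    out["email"] = out["website"].map(scraped).fillna("")
    return out

# Example:
# file2 = pd.read_excel("file2.xlsx")
# file3 = build_file3(file2, {"https://a.example": "hola@a.example"})
# file3.to_excel("file3.xlsx", index=False)
```

The email harvesting itself would reuse the same requests/BeautifulSoup approach the brief names; Phase 2's offer sheet is a second pass over File 3's live sites.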
...processes such as: Automatically scraping a new client’s website and relevant public social profiles upon signup Structuring and exporting that data into organized files (Google Drive/Docs/Sheets) Creating standardized client folder structures in Google Drive Connecting onboarding forms to project management tools Automating internal task creation for our team Integrating AI tools (e.g. GPT workflows) into onboarding and research processes This is just the starting point, we want someone who can think strategically about workflow architecture, not just execute isolated zaps. Ideal Candidate Strong experience with tools like n8n, Zapier, Make (Integromat), or similar Comfortable working with APIs where needed Experience with web scraping tools and ...
I'm looking for help with manually sending and handling replies for 1000 cold messages daily via WhatsApp. Budget is $15 per 1000 per day for 20 working days, ie $300 / month. Scope of work - Send 1000 cold messages daily on WhatsApp manually from your own UK numbers. - Handle all replies and interactions from recipients. - Provide screenshots of all sent messages and replies. Additional information The task is ongoing, requiring daily manual effort. You must have experience handling bulk campaigns on WhatsApp. Please apply only if you have experience. Fixed price Up to US$300/month Hiring duration 6+ months
I need a data scraping expert to help generate leads from a list of websites. Requirements: - Scrape contact information, product listings, or user reviews (to be specified). - Work from a provided list of URLs. Ideal Skills: - Experience with data scraping tools and techniques. - Ability to handle multiple URLs and extract data accurately. - Attention to detail and reliability. Please share your portfolio and relevant experience.
...me how to run the script and change the target URL or output path if needed. Code quality matters to me: no hard-coded absolute paths, clear variable names, and graceful error handling so the run doesn’t stop if a single page fails. The entire job should fit comfortably within one to two days of focused work; total compensation is a fixed $40. If everything runs smoothly, I’ll have similar scraping mini-projects to pass along in the near future....
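The code-quality requirements above (configurable URL and output path, no hard-coded absolute paths, graceful per-page error handling) can be sketched like this; the flag names are assumptions:

```python
# Sketch: a configurable scraper skeleton that survives single-page failures.
import argparse
import requests

def fetch_page(url, timeout=10):
    """Return the page text, or None on failure (logged, not fatal)."""
    try:
        resp = requests.get(url, timeout=timeout)
        resp.raise_for_status()
        return resp.text
    except requests.RequestException as exc:
        print(f"skipping {url}: {exc}")
        return None

def build_parser():
    parser = argparse.ArgumentParser(description="one-off scraper")
    parser.add_argument("--url", required=True, help="target start URL")
    parser.add_argument("--output", default="output.csv",
                        help="output path, relative to the working directory")
    return parser

# Example run:
#   python scrape.py --url https://example.com --output data.csv
# inside main():
#   args = build_parser().parse_args()
#   page = fetch_page(args.url)   # None simply skips; the run continues
```

Because `fetch_page` returns `None` instead of raising, a loop over many pages keeps going when one fails, which is exactly the "graceful error handling" asked for.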
Job: Extract US Multi-Location Restaurant Brands from OpenTable I need a data researcher to extract restaurant brands from OpenTable () that meet the following criteria: Requirements US-based restaurant brands only Brands must operate between 15 and 50 total locations (company-wide, not just OpenTable listings) At least one location must be listed on OpenTable Must be an actual restaurant operator (no tech companies, media, associations, suppliers, or consultants) Deliverable A clean CSV file with the following columns: Brand Name Official Website URL Total Number of Locations (verified from website) Source URL confirming location count OpenTable URL (at least one listing link) Notes (if clarification needed) No duplicates. No single-site independents. No chains over
I have a backlog of paper-based invoices and receipts that must be keyed into an existing Excel template. Every figure needs to be captured exactly as it appears, with correct dates, vendor names, GST fields and reference numbers, so manual data-entry accuracy is critical. Because these records ultimately feed our Tally ledger, you should understand basic accounting concepts—debits, credits, tax codes—and be comfortable cross-checking your work against Tally reports to be sure totals match. No automated scraping is possible here; it is straight keyboard entry followed by a brief reconciliation step. Deliverable • Completed Excel workbook, fully populated and auto-sum balances matching the physical documents and my Tally control totals. Acceptance cri...
...related in the last month and note the information I specify below. Here is exactly what I want verified on every profile: • Recent posts activity – record the date of the most recent post so I can see at a glance who is active and who is dormant. • Availability of contact information – confirm whether an email, phone number, or “email” button is visible in the bio or contact section. No bots or scraping tools, please; I want a manual check for accuracy. Deliverables • The original Excel file returned with new columns for Last Post Date, Active (Yes/No), and Phone Number Still the Same (Yes/No); if the number is not the same, create a column next to the old number and write the new number there. • Highlight the ones that are no longer active, meani...
Industrial Automation Product Data Extraction, Deduplication & Structured Image Collection Project Overview We are an industrial automation parts distributor building a structured product database to support inbound enquiries and SEO growth. We require an experienced data extraction specialist to: Extract structured product data from major industrial / electronic component distributor websites Identify duplicate manufacturer part numbers across multiple sources Merge all unique information into a single consolidated dataset Extract and organise all available product images per part number Deliver a clean, deduplicated, production-ready dataset This project includes: Data extraction Normalization Deduplication Intelligent merging Structured image...