The Scraper buttons
- Currently the code tracks the status: basic details scraped, then full scrape. We need to consider how we refresh the basic scrape (e.g. by checking whether the connection count has changed) to decide whether a new scrape is required (a sketch of one way to model this follows the list below)
So how should it work?
- First-time use:
- Hide the "Import and Email", "Export", and "Email" buttons
- Hide Connections Scraped info button
- The Scrape button should show: "1) Import Connections".
- This should pop up the following text box: "Step 1. Our servers will now log in to LinkedIn and pull in the basic details of all your [1,234] connections. ….."
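A rough sketch of how that status could drive which buttons are shown, assuming a per-user state record; the state names, the staleness check, and the button mapping are illustrative, not the current code:

```ts
// Rough sketch of a per-user scrape state and the buttons shown for each state.
// State names and the "Start Full Scrape" label are placeholders, not the real code.
type ScrapeStatus =
  | "never_scraped"      // first-time user
  | "basic_in_progress"  // Step 1 running (a few minutes)
  | "basic_done"         // basic details visible in the index
  | "full_in_progress"   // batched full scrape running (can take days)
  | "full_done";         // everything scraped, exports ready

interface UserScrapeState {
  status: ScrapeStatus;
  scrapedConnectionCount: number;  // count at the time of the last basic scrape
  currentConnectionCount: number;  // count LinkedIn reports now
}

// Staleness check from the note above: if the live connection count differs
// from the count we scraped, a fresh basic scrape is probably needed.
function basicScrapeIsStale(s: UserScrapeState): boolean {
  return s.status !== "never_scraped" &&
         s.currentConnectionCount !== s.scrapedConnectionCount;
}

// Which buttons to show for each state (labels follow the notes above where possible).
const BUTTONS_BY_STATUS: Record<ScrapeStatus, string[]> = {
  never_scraped:     ["1) Import Connections"],
  basic_in_progress: ["Connections Scraped info"],
  basic_done:        ["Start Full Scrape", "Connections Scraped info"],
  full_in_progress:  ["Connections Scraped info"],
  full_done:         ["Export", "Email", "Connections Scraped info"],
};
```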
I have a system that allows users to scrape their LinkedIn connections and export the results as a VCF or XLS. The result is that their LinkedIn connections end up in their contact lists, on their phones, etc. Technically it works by first scraping the basic details of each of the user's connections; the server then opens each connection and scrapes all the details in full (education, employment, languages, etc.). The first process takes only a few minutes. The second process can take hours or days depending on the number of connections. To avoid triggering LinkedIn's anti-scraping policies we have to process the connections in batches with a delay between each batch, hence it can take days.
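To make the batching concrete, here is a rough sketch of the full-scrape loop; the batch size, the delay, and the two helper functions are placeholders, not our actual values or API:

```ts
// Rough sketch of the batched full scrape. BATCH_SIZE and BATCH_DELAY_MS are
// placeholder values, and scrapeFullProfile/saveProfile are assumed helpers.
declare function scrapeFullProfile(id: string): Promise<unknown>;
declare function saveProfile(id: string, profile: unknown): Promise<void>;

const BATCH_SIZE = 25;
const BATCH_DELAY_MS = 90 * 60 * 1000; // 90 minutes between batches

const sleep = (ms: number) => new Promise<void>(resolve => setTimeout(resolve, ms));

async function runFullScrape(connectionIds: string[]): Promise<void> {
  for (let i = 0; i < connectionIds.length; i += BATCH_SIZE) {
    const batch = connectionIds.slice(i, i + BATCH_SIZE);
    for (const id of batch) {
      const profile = await scrapeFullProfile(id); // education, employment, languages, ...
      await saveProfile(id, profile);
    }
    // Wait between batches so we stay under LinkedIn's anti-scraping limits.
    if (i + BATCH_SIZE < connectionIds.length) {
      await sleep(BATCH_DELAY_MS);
    }
  }
}
```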
I need help to do the following: design buttons that logically walk a new user through this process. The buttons can appear/disappear as appropriate. What should the buttons do? So my thought was that the Scrape button should initially say "Scrape" and it would pop up the following message: "Step 1. Our servers will now log in to LinkedIn and pull in the basic details of all your [1,234] connections. This will take a few minutes. After that process you will be able to see the basic details (name, photo, current position) of each of your connections in the index here. However, the 'full scrape' (resulting in Employment, Education, and other details) will take time (about [3] days), as it is done in batches. Once complete we will email you to let you know (on xxxx@xxx.com) and you can view them all in the index and download a VCF for use in Outlook or an XLS for more detailed analysis. You can log in and check the progress at any time. Would you like us to email you progress reports, or simply when complete?"
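The "[3] days" figure in that popup could be derived from the connection count and the batch schedule; a sketch using the same placeholder batch numbers as above:

```ts
// Sketch of the "[3] days" estimate shown in the Step 1 popup, using the same
// placeholder batch numbers as the loop above (25 per batch, 90 minutes apart).
function estimateFullScrapeDays(connectionCount: number,
                                batchSize = 25,
                                batchDelayMinutes = 90): number {
  const batches = Math.ceil(connectionCount / batchSize);
  const totalMinutes = batches * batchDelayMinutes;
  return Math.max(1, Math.round(totalMinutes / (60 * 24))); // whole days, minimum 1
}

// e.g. 1,234 connections -> 50 batches -> 4,500 minutes -> about 3 days
```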
I was wondering if I should then show a second button, "2) Initiate full connection-level scrape", or whether I should just go ahead and initiate the process automatically. I also have a display that shows them the progress so far. Do you agree with that approach, or how would you make it easier? I also need to build a process to update the download - how should I handle that?
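For the download-update question, one approach I am considering (a rough sketch; all helper function names are made up) is to diff the live connection list against what we have stored, queue only the new connections for scraping, and then regenerate the VCF/XLS:

```ts
// Sketch of a refresh pass: diff the live connection list against the stored one,
// queue only the new IDs for scraping, and rebuild the exports afterwards.
// All five declared functions below are assumed helpers, not existing APIs.
declare function fetchCurrentConnectionIds(userId: string): Promise<string[]>;
declare function getStoredConnectionIds(userId: string): Promise<string[]>;
declare function queueBasicScrape(userId: string, ids: string[]): Promise<void>;
declare function removeConnection(userId: string, id: string): Promise<void>;
declare function rebuildExports(userId: string): Promise<void>; // regenerates VCF and XLS

async function refreshConnections(userId: string): Promise<void> {
  const current = new Set(await fetchCurrentConnectionIds(userId));
  const stored = new Set(await getStoredConnectionIds(userId));

  const added = [...current].filter(id => !stored.has(id));
  const removed = [...stored].filter(id => !current.has(id));

  if (added.length) await queueBasicScrape(userId, added); // new connections enter the normal pipeline
  for (const id of removed) await removeConnection(userId, id);

  await rebuildExports(userId); // so the downloads reflect the refreshed set
}
```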