Once the broken scrapers are fixed, the main ‘ask’ is to make the app “user-specific” (i.e. each user can define their own list of stories and categories to include in the summary email). So the following need to be linked to the logged-in user:
At the moment each of these fields is attached to the news article, so we need to create a User-News field relationship to achieve this. Does that make sense to you?
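A minimal sketch of what that User-News relationship could look like as a join table, using an in-memory SQLite database for illustration. All table and column names here are assumptions, not the project's actual schema.

```python
# Hypothetical sketch: instead of per-user flags living on the news article
# itself, a user_news join table holds each user's own state for an article
# (selected for the summary email, read, etc.). Names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE user (id INTEGER PRIMARY KEY, email TEXT);
CREATE TABLE news (id INTEGER PRIMARY KEY, title TEXT, category TEXT);
-- One row per (user, article): the user-specific fields that currently sit
-- on the article move here.
CREATE TABLE user_news (
    user_id    INTEGER REFERENCES user(id),
    news_id    INTEGER REFERENCES news(id),
    in_summary INTEGER DEFAULT 0,
    read_at    TEXT,
    PRIMARY KEY (user_id, news_id)
);
""")
```

With this shape, each user's story and category selections are independent rows rather than shared flags on the article.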
Complete
18.31
Other
Make the Economic Stats scrape times 1am and 8.30am
Cron does not include FirstFT
Add a button to test that the FT.com login is working
Full content button (FT) only shows in first block (unassigned)
Upon user login, run a check on the user's logins and passwords and determine access accordingly
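The scrape schedule above could be expressed as a crontab along these lines. This is purely illustrative: the binary path, console command names, and the FirstFT timing are assumptions, not the project's actual commands.

```
# Economic Stats at 1am and 8.30am (command names are hypothetical)
0 1 * * *   /usr/bin/php /app/bin/console app:scrape-economic-stats
30 8 * * *  /usr/bin/php /app/bin/console app:scrape-economic-stats
# FirstFT is currently missing from cron; this schedule is a placeholder
0 7 * * *   /usr/bin/php /app/bin/console app:scrape-firstft
```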
Complete
18.02
Favicon
Not working, in either the bookmark or the tab, but only on the home page… odd
Review of security (Dashboard: changed to ROLE_USER)
Complete
18.10
Read count
The read count needs to be user-specific.
So this is the number of unread articles in the past 24 hours, per user. Serve the count by user and source
The read all button should not affect archives
Perhaps in the archive, unread articles could be shown separately.
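The per-user, per-source unread count described above could be computed along these lines. The data model (article dicts, a set of read article ids) is an assumption for illustration.

```python
# Hypothetical unread-count sketch: articles published in the past 24 hours
# that a given user has not read, grouped by source.
from datetime import datetime, timedelta

def unread_counts(articles, read_ids, now=None):
    """articles: iterable of dicts with 'id', 'source', 'published'.
    read_ids: set of article ids this user has already read."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(hours=24)
    counts = {}
    for a in articles:
        if a["published"] >= cutoff and a["id"] not in read_ids:
            counts[a["source"]] = counts.get(a["source"], 0) + 1
    return counts
```

Because `read_ids` is per user, two users see different counts over the same set of articles, which is the point of making the count user-specific.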
Complete
18.09
Removed the CMS and Settings entities
Why isn't the LinkedIn password appearing in the list on a User's profile?
Complete
18.11
Pricing
The functions of the website are:
LinkedIn scrape
News:
Single place to read articles
Summary access only
Hover for full article
Ability to mark articles as read, to avoid re-reading
Ability to select key articles and send a summary email
Mark economic stats as favourite - to generate an email
See what others are liking - are you missing an important, well-read article?
Summary Read-only
Full articles available via a link
One-stop Read access
However
Complete
18.32
Economic Market Statistics
Cron job. Refresh every 15 mins
Button to refresh manually if the data is >10 mins old
Historical view by stat
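The manual-refresh guard above could be a simple staleness check; a sketch, assuming the last scrape time is stored as a timestamp (function and field names are hypothetical).

```python
# Hypothetical guard behind the manual-refresh button: only allow a manual
# refresh when the last scrape is more than 10 minutes old.
from datetime import datetime, timedelta

def can_refresh(last_scraped, now=None):
    now = now or datetime.utcnow()
    return now - last_scraped > timedelta(minutes=10)
```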
Complete
18.19
LinkedIn:
Popup that checks that the LinkedIn login and password are successful
The first result should return the number of connections and estimate the time to download them, before proceeding.
Advise the user how long it will take to download and that a file will be emailed to them
Email CSV file - one step directly after the scrape, i.e. save the file in the database at the end of the scrape and email it (merging the 3 buttons we have)
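Two pieces of that flow can be sketched with the standard library: serialising the scraped connections to CSV, and the up-front time estimate shown before the scrape starts. The field names, the 100-profiles-per-minute rate, and the omission of the actual scrape and emailing steps are all assumptions for illustration.

```python
# Sketch of the merged one-step flow: scrape -> CSV -> email. The scrape
# itself and the mailer call are placeholders for whatever the app does.
import csv
import io

def export_connections(connections):
    """Serialise scraped connections to CSV text, in memory."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["name", "company", "title"])
    writer.writeheader()
    for c in connections:
        writer.writerow(c)
    return buf.getvalue()

def estimate_minutes(n_connections, per_minute=100):
    # Rough estimate shown to the user before proceeding; the rate is an
    # assumed figure, not a measured one. Rounds up, minimum 1 minute.
    return max(1, -(-n_connections // per_minute))
```

The one-step flow would then be: count connections, show `estimate_minutes(...)` to the user, scrape, `export_connections(...)`, save, and email the result.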