This commit adds the itertools crate, which is used to dedup the Vec
of URLs when downloading.
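
A minimal sketch of how `Itertools::unique` can do this deduplication; the function and variable names below are illustrative, not paperoni's actual code:

```rust
use itertools::Itertools;

// Deduplicate the list of input URLs while keeping the first
// occurrence of each, so the original input order is preserved.
fn dedup_urls(urls: Vec<String>) -> Vec<String> {
    urls.into_iter().unique().collect()
}

fn main() {
    let urls = vec![
        "https://example.com/a".to_string(),
        "https://example.com/b".to_string(),
        "https://example.com/a".to_string(),
    ];
    assert_eq!(dedup_urls(urls).len(), 2);
}
```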
fix: fix error message
feat: change the serif and mono font declarations
- refactor comments
- move `cli::Error` to `errors::ErrorCli`
- stop reordering the input URLs
- move the pure functionality of `init_logger` into a separate function
- Add ReadabilityError field
- Refactor the `article` getter in Extractor to return a `&NodeRef`. This
  relies on the assumption that the article has already been parsed and
  panics otherwise.
- Map errors in `fetch_html` to include the source URL (see the sketch after this list)
- Change `article_link` to `article_source`
- Add `Into` conversion for `UTF8Error`
- Collect errors in `generate_epubs` so they can be displayed in a table
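
A hedged sketch of the error-handling items above. `PaperoniError`, `ErrorKind`, and `do_request` are hypothetical stand-ins; only the pattern (a `From` conversion plus `map_err` that attaches the source URL) comes from the notes in this list:

```rust
use std::str::Utf8Error;

// Hypothetical error type standing in for paperoni's custom error.
#[derive(Debug)]
struct PaperoniError {
    kind: ErrorKind,
    article_source: Option<String>,
}

#[derive(Debug)]
enum ErrorKind {
    Utf8(Utf8Error),
    Http(String),
}

// A `From` impl gives the `Into` conversion for free.
impl From<Utf8Error> for PaperoniError {
    fn from(err: Utf8Error) -> Self {
        PaperoniError { kind: ErrorKind::Utf8(err), article_source: None }
    }
}

// Attach the source URL to any failure instead of unwrapping.
fn fetch_html(url: &str) -> Result<String, PaperoniError> {
    do_request(url).map_err(|e| PaperoniError {
        kind: ErrorKind::Http(e),
        article_source: Some(url.to_string()),
    })
}

// Placeholder for the real HTTP call.
fn do_request(_url: &str) -> Result<String, String> {
    Err("connection refused".to_string())
}

fn main() {
    if let Err(e) = fetch_html("https://example.com/article") {
        eprintln!("failed to fetch {:?}: {:?}", e.article_source, e.kind);
    }
}
```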
Using this custom error type, many instances of unwrap are replaced
with mappings to errors that are then logged in main.rs. This prevents
paperoni from crashing while downloading articles when the errors are
recoverable or should not affect other downloads.
As a result, failed image downloads are now ignored and the original
image URLs are left intact.
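
A hedged sketch of this unwrap-to-logging pattern; `download_article`, the error type, and the messages are placeholders, not paperoni's actual code:

```rust
// Placeholder for the real download logic. A failed image download is
// treated as recoverable: the original image URL stays in the article
// instead of aborting the run.
fn download_article(_url: &str) -> Result<(), String> {
    Ok(())
}

fn main() {
    let urls = ["https://example.com/a", "https://example.com/b"];
    let mut errors = Vec::new();

    for url in &urls {
        match download_article(url) {
            Ok(_) => println!("downloaded {}", url),
            Err(e) => {
                // Log and collect the error; keep processing the
                // remaining downloads instead of crashing.
                eprintln!("{}: {}", url, e);
                errors.push((url.to_string(), e));
            }
        }
    }

    if !errors.is_empty() {
        eprintln!("{} download(s) failed", errors.len());
    }
}
```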