Collection of all my small web scraping projects.
-
- Update the README.md file so it is more readable.
- Logger files are moved to the "dependances" folder.
- All Python files that use the logger have their import changed from "import logger" to "import dependances.logger as logger".
-
- Add a human-readable datetime as the second column of the CSV file; the first column is the time in seconds.
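A minimal sketch of that conversion, assuming the first CSV field holds a Unix epoch in seconds and the timestamp is rendered in UTC (the function name is illustrative, not from the repo):

```python
from datetime import datetime, timezone

def add_readable_time(row):
    """Given a CSV row whose first field is Unix time in seconds,
    insert a human-readable UTC timestamp as the second field."""
    epoch = float(row[0])
    readable = datetime.fromtimestamp(epoch, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")
    return [row[0]] + [readable] + row[1:]
```

For example, `add_readable_time(["0", "BTC"])` yields `["0", "1970-01-01 00:00:00", "BTC"]`.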
-
- Patch - Fix a breaking import from logger.py by appending the dependances folder to the import path.
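A sketch of that fix, assuming the script sits one level above the "dependances" folder (the exact placement in the repo may differ):

```python
import os
import sys

# Append the sibling "dependances" folder to the module search path so
# that "import logger" resolves to dependances/logger.py. The folder
# name matches the repo's spelling.
HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(HERE, "dependances"))
```

After this runs, a plain `import logger` inside logger-dependent modules finds the relocated file.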
-
- Added function log2csv() for all logging; writes are wrapped in try/except.
- Added Windows-only notification for failed logging.
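The two bullets above can be sketched together, assuming log2csv() appends one row per call; the notification here uses the Win32 MessageBox API as a stand-in for whatever mechanism the repo actually uses:

```python
import csv
import sys

def notify_windows(title, message):
    # Windows-only popup; silently a no-op on other platforms.
    if sys.platform == "win32":
        import ctypes
        ctypes.windll.user32.MessageBoxW(0, message, title, 0)

def log2csv(path, row):
    """Append one row to a CSV log file.

    Returns True on success; on failure, raises a Windows
    notification and returns False instead of crashing the scraper.
    """
    try:
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow(row)
        return True
    except OSError as exc:
        notify_windows("Logging failed", str(exc))
        return False
```

Centralizing the try/except here is what lets the calling scripts drop their own duplicated error handling.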
-
- Use the Logger.py function log2csv(); this reduces code repetition.
-
- Use the Logger.py function log2csv(); this reduces code repetition.
-
- Added Windows-only notification for failed logging.
-
- Added Windows-only notification for failed logging.
-
- Now using an API.
- More efficient; uses less CPU time.
-
- Using the Nicehash API.
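A sketch of API-based price logging under assumptions: the URL below is a placeholder, not a confirmed Nicehash endpoint, and the response is assumed to be a flat JSON object mapping market symbols to price strings; check the actual Nicehash API docs for the real endpoint and schema.

```python
import json
from urllib.request import urlopen

# Placeholder endpoint -- substitute the real Nicehash API URL.
API_URL = "https://api2.nicehash.com/exchange/api/v2/info/prices"

def fetch_prices(url=API_URL):
    """Fetch the current price table as a dict of market -> price string."""
    with urlopen(url, timeout=10) as resp:
        return json.loads(resp.read().decode("utf-8"))

def extract_price(prices, market):
    """Pull one market's price as a float, or None if the market is absent."""
    value = prices.get(market)
    return float(value) if value is not None else None
```

Polling one JSON endpoint is what makes this cheaper than the earlier approach: no page rendering or HTML parsing, just a single small request per log interval.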
-
- The Python code now runs more efficiently.
- Select the drive the Python file is on.
- Change directory to the folder containing the Python file.
- Run the Python file with the interpreter: {interpreter absolute path} .\{python file relative path}
```bat
e:
cd "E:\Personal - GDrive\Cron Jobs\Logging Crypto Prices - 2021.05(May). 24"
C:\Anaconda3\python.exe .\Log_Prices.py
```
To run the scheduled task without a command window appearing (invisible CMD): https://www.howtogeek.com/tips/how-to-run-a-scheduled-task-without-a-command-window-appearing/