Update documentation and ChangeLog
ChangeLog
@@ -1,3 +1,13 @@
+v0.5 (25/05/2017)
+** User **
+	Add --dry-run (-D) argument
+** Dev **
+	Use cPickle instead of pickle
+	Don't save all robot requests (only the first pass is kept), which saves a large amount of memory/disk space
+	Add one more rule to robot detection: more than ten 404 pages viewed
+** Bugs **
+
+
 v0.4 (29/01/2017)
 ** User **
 	Remove crawlers from feed parsers
@@ -11,7 +11,7 @@ Nevertheless, iwla is only focused on HTTP logs. It uses data (robots definition
 Usage
 -----
 
-    ./iwla [-c|--clean-output] [-i|--stdin] [-f FILE|--file FILE] [-d LOGLEVEL|--log-level LOGLEVEL] [-r|--reset year/month] [-z|--dont-compress] [-p]
+    ./iwla [-c|--clean-output] [-i|--stdin] [-f FILE|--file FILE] [-d LOGLEVEL|--log-level LOGLEVEL] [-r|--reset year/month] [-z|--dont-compress] [-p] [-D|--dry-run]
 
     -c : Clean output (database and HTML) before starting
     -i : Read data from stdin instead of conf.analyzed_filename
@@ -20,6 +20,7 @@ Usage
     -r : Reset analysis to a specific date (month/year)
     -z : Don't compress databases (bigger but faster, not compatible with compressed databases)
     -p : Only generate display
+    -D : Dry run (don't write/update files to disk)
 
 Basic usage
 -----------
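A dry-run flag like the one this commit documents is typically implemented by gating every write behind the parsed option. The sketch below is hypothetical and not iwla's actual code; the function name `save_database` and the file paths are assumptions for illustration only.

```python
import argparse

def save_database(path, data, dry_run=False):
    """Write data to path, unless dry_run is set (then only report)."""
    if dry_run:
        # Dry run: report what would happen, touch nothing on disk
        print(f"[dry-run] would write {len(data)} bytes to {path}")
        return False
    with open(path, "wb") as f:
        f.write(data)
    return True

# Hypothetical argument parsing mirroring the documented -D/--dry-run flag
parser = argparse.ArgumentParser()
parser.add_argument("-D", "--dry-run", action="store_true",
                    help="Don't write/update files to disk")
args = parser.parse_args(["--dry-run"])

# With --dry-run set, nothing is written and the call reports instead
wrote = save_database("stats.db", b"payload", dry_run=args.dry_run)
```

With `--dry-run` the analysis logic can still run end to end, which makes the option useful for testing configuration changes without risking the on-disk databases.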