Study of noise detection in synthesized images using 26 attributes

Noise detection using 26 attributes

Description

Noise detection on synthesized images using 26 attributes obtained from a few filters.

Filter list (an application sketch follows the list):

  • average
  • wiener
  • median
  • gaussian
  • wavelet
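
For illustration only, the sketch below (not part of the repository) applies these five filters to a 2D grayscale image using SciPy and scikit-image; the kernel sizes and sigma value are assumptions rather than the exact settings used in data_attributes.py.

    # Minimal sketch: apply the five filters to a 2D grayscale image.
    # Kernel sizes and sigma are illustrative assumptions.
    from scipy import ndimage, signal
    from skimage.restoration import denoise_wavelet

    def apply_filters(image):
        """Return a dict mapping filter name to the filtered image.

        `image` is expected to be a 2D float array with values in [0, 1].
        """
        return {
            'average':  ndimage.uniform_filter(image, size=3),
            'wiener':   signal.wiener(image, mysize=3),
            'median':   ndimage.median_filter(image, size=3),
            'gaussian': ndimage.gaussian_filter(image, sigma=1),
            'wavelet':  denoise_wavelet(image),
        }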

Requirements

pip install -r requirements.txt

Project structure

Link to your dataset

You need a dataset that respects the following structure (an iteration sketch follows the tree):

  • dataset/
    • Scene1/
      • Scene1_00050.png
      • Scene1_00070.png
      • ...
      • Scene1_01180.png
      • Scene1_01200.png
    • Scene2/
      • ...
    • ...
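
For reference, a small helper such as the one below (purely illustrative, not provided by the project) could walk this layout; interpreting the trailing number of each file name as the rendering sample index is an assumption based on the naming shown above.

    from pathlib import Path

    def iter_scene_images(dataset_dir='dataset'):
        """Yield (scene_name, sample_index, image_path) for every scene image.

        Assumes the layout above, i.e. dataset/<Scene>/<Scene>_<NNNNN>.png.
        """
        for scene_dir in sorted(p for p in Path(dataset_dir).iterdir() if p.is_dir()):
            for image_path in sorted(scene_dir.glob(scene_dir.name + '_*.png')):
                index = int(image_path.stem.rsplit('_', 1)[-1])
                yield scene_dir.name, index, image_path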

Code architecture description

  • modules/*: contains all modules useful for the whole project (such as configuration variables)
  • analysis/*: contains all Jupyter notebooks used for analysis during the thesis
  • generate/*: contains Python scripts for generating data from scenes (described later)
  • data_processing/*: all Python scripts for generating the custom datasets used by the models
  • prediction/*: all Python scripts for predicting new thresholds from computed models
  • data_attributes.py: file containing the implementation of every feature extracted from an image (an illustrative sketch follows this list)
  • custom_config.py: overrides the project's main configuration from modules/config/global_config.py
  • train_model.py: script used to train a specific available model
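
As an illustration of what a feature-extraction step might look like, the sketch below derives simple statistics from the residual left by each filter; the chosen statistics are assumptions and do not reproduce the exact 26 attributes implemented in data_attributes.py.

    import numpy as np

    def extract_example_attributes(image, filtered):
        """Illustrative attribute vector built from each filter's noise residual.

        `filtered` can be the dict returned by the earlier apply_filters() sketch.
        The statistics below are assumptions, not the project's 26 attributes.
        """
        features = []
        for name in ('average', 'wiener', 'median', 'gaussian', 'wavelet'):
            residual = image - filtered[name]   # noise estimate removed by the filter
            features.extend([
                residual.mean(),
                residual.std(),
                np.abs(residual).mean(),
                np.percentile(np.abs(residual), 95),
            ])
        return np.array(features)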

Generated data directories:

  • data/*: folder which will contain all generated .train & .test files used to train the models.
  • data/saved_models/*: all saved scikit-learn or Keras models.
  • data/models_info/*: markdown files giving quick information about model performance and predictions, generated after running the run/runAll_*.sh script.
  • data/results/: folder containing the model_comparisons.csv file used to store model performance (see the sketch after this list).
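
As a hedged example of how these artifacts are typically handled (the file names, model choice, and CSV columns below are assumptions, not the project's exact conventions), a scikit-learn model can be persisted with joblib and a summary row appended to the comparison file:

    import csv
    from pathlib import Path

    import joblib
    from sklearn.svm import SVC

    # Train (not shown) and persist a model; file names here are illustrative.
    model = SVC(kernel='rbf')
    # model.fit(X_train, y_train)   # X_train / y_train would come from the .train files
    Path('data/saved_models').mkdir(parents=True, exist_ok=True)
    joblib.dump(model, 'data/saved_models/example_svm.joblib')

    # Reload the persisted model later for prediction.
    model = joblib.load('data/saved_models/example_svm.joblib')

    # Append a summary row to the comparison file (column layout is an assumption).
    Path('data/results').mkdir(parents=True, exist_ok=True)
    with open('data/results/model_comparisons.csv', 'a', newline='') as f:
        csv.writer(f).writerow(['example_svm', 0.0])  # model name, test accuracy (placeholder)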

License

This project is licensed under the MIT License.