curriculum-importer-core5 v0.0.1

License: UNLICENSED
Last release: 4 years ago

tool-curriculum-importer-core5

Codebase for the Curriculum Importer, a program that imports curriculum from JSON/CSV formatted files into the MySQL database.

Usage

Instructions to install and run the Core5 Curriculum Importer

Development Infrastructure

This is a Node.js application implemented in TypeScript that can be run from main.js as a script or as an Express application.

Repository Link

Curriculum Importer Repository Link

Installation

To install and build this package:

mkdir project
cd project
git clone https://stash.trstone.com/projects/LXJS/repos/tool-curriculum-importer-core5.git
cd curriculum_importer/app
npm install
npm run build

Logging

The importer writes its log to:

curriculum_importer/app/curriculum_importer.log

Check the status of the last import.

npm run status:dev160 

Run the curriculum import in dry-run mode

npm run dryrun:dev160

Clear the hashes from the database to force a repopulate. (Note: this only clears the hash key/values in system_properties; no content is removed.)

npm run clear:dev160

Run the curriculum import

npm run import:dev160

Dry Run SQL Location

curriculum_importer/app/sql/core5/update_<curriculumType>_<version>.sql

    <curriculumType> - content, index, kinds, unitKindsLessonsMap
    <version>        - 18, 21

>ls sql/core5
update_content_18.sql			update_index_18.sql			update_kinds_18.sql			update_unitKindsLessonsMap_18.sql
update_content_21.sql			update_index_21.sql			update_kinds_21.sql
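The naming convention above can be sketched as a small helper. Note that `sqlFileName` is a hypothetical function for illustration, not part of the importer's actual API:

```typescript
// Hypothetical helper illustrating the dry-run SQL file naming convention.
// The actual importer may construct these paths differently.
type CurriculumType = "content" | "index" | "kinds" | "unitKindsLessonsMap";
type CurriculumVersion = 18 | 21;

function sqlFileName(curriculumType: CurriculumType, version: CurriculumVersion): string {
  return `update_${curriculumType}_${version}.sql`;
}
```

For example, `sqlFileName("content", 18)` yields `update_content_18.sql`, matching the listing above.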

Technical Notes

The Core5 Curriculum Importer fetches JSON from the remote server passed in (e.g. qa-master):

    https://qa-master.lexiacore5.com/build.json
    https://qa-master.lexiacore5.com/build.properties
    https://www.lexiacore5.com/version/4.2.0/master/825/db_update/update.json

Input files are found in:

    https://www.lexiacore5.com/version/4.2.0/master/825/db_update/update.json

There are 2 versions of Core5 curriculum content in update.json:

    version 3 - (18 level)
    version 4 - (21 level)

The last version successfully uploaded is stored in the following key/value in lexia.system_properties. (Note: this functionality was preserved from the prior process.)

| Database Key | Value | Purpose |
| --- | --- | --- |
| core5ContentVersion | 4.2.0.862 | Last version string updated from the build.json file. |

Checking the Upload Status and Plan

    1. The system first checks the global hash on the build.json file.
    2. It compares that to the database hash in system_properties. If the
       system_properties key CORE5_ALL_CONTENT_UPDATED matches the global hash,
       no update is needed; the user is informed and the update does not occur.
    3. Each type of content has a JSON file, a schema file, and a matching key
       in the system_properties table.
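A minimal sketch of the global-hash short circuit, assuming a SHA-256 hash (the importer's actual hash algorithm is not documented here, and the function names are illustrative):

```typescript
import { createHash } from "node:crypto";

// Compute a hash of a file's contents. SHA-256 is an assumption for this
// sketch; the real importer compares against the CORE5_ALL_CONTENT_UPDATED
// key stored in lexia.system_properties.
function fileHash(contents: string): string {
  return createHash("sha256").update(contents, "utf8").digest("hex");
}

// True when the stored hash matches the build.json hash, meaning no import is needed.
function contentUpToDate(buildJsonContents: string, storedHash: string | null): boolean {
  return storedHash !== null && storedHash === fileHash(buildJsonContents);
}
```

A missing or stale stored hash forces the importer to proceed to the per-type checks.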

Description of Content Status Keys in lexia.system_properties Table and JSON

Database KeyHash for FilePurpose or tables updated
CORE5_ALL_CONTENT_UPDATED_PROPERTY/build.jsonGlobal Hash, The database hash holds the last updated hash.
CORE5_INDEX_21/../v4/program_index_21.jsoncore5 schema: curriculum_version, activity_level, activity_task, level, activities and task.
CORE5_INDEX_18/../v3/program_index_18.jsoncore5 schema: curriculum_version, activity_level, activity_task, level, activities and task.
CORE5_KINDS_18/../v3/kinds.jsoncore5 schema: kinds
CORE5_KINDS_21/../v4/kinds.jsoncore5 schema: kinds
CORE5_CONTENT_18/../v3/content.jsonstudent schema: content -> trigger (UPDATE/INSERT) updates lexia schema content
CORE5_CONTENT_21/../v4/content.jsonstudent schema: content -> trigger (UPDATE/INSERT) updates lexia schema content
CORE5_UNITKINDSLESSONSMAP_18/../mapUnitKindsToLessons.csvcore5 schema: kinds_translation lexia schema: ancillary_lat_ref

For each of the above upload types the following steps occur:

1. The system checks the value of the last upload to the database for that particular file.
    a. If it hasn't changed, no update occurs.
    b. If it has, the update occurs.

2. If the update occurs for a given data type:
    a. The new importer validates the schema before uploading.
    b. If successful, it reads in the JSON and generates SQL statements for the tables listed above.
    c. The queries are executed in batches, and return values from the SQL queries are validated and verified.
    d. If any errors occur, an exception is thrown for that particular data type and it is logged to the file.
    e. If the import is successful, the updated content hash is written to the database.
    f. The global content hash and core5ContentVersion are updated only when all hashes are up to date. (The importer checks each time a content type completes to see if it is the last.)
    g. The INDEX type has dependents. If this type changes, the hashes for the content types that depend on the index are deleted so they are also regenerated.
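The index-dependency rule above can be sketched as follows. The dependency map here is an assumption for illustration; the text only states that some content types depend on the index, not which ones:

```typescript
// Assumed dependency map: when an INDEX file's hash changes, the hashes of the
// dependent types are deleted so those types are regenerated as well.
// Which types actually depend on the index is an assumption in this sketch.
const dependentsOfIndex: Record<string, string[]> = {
  CORE5_INDEX_18: ["CORE5_CONTENT_18", "CORE5_KINDS_18"],
  CORE5_INDEX_21: ["CORE5_CONTENT_21", "CORE5_KINDS_21"],
};

// Given the keys whose file hashes changed, return every key to re-import:
// the changed keys plus any types that depend on a changed index.
function keysToReimport(changedKeys: string[]): Set<string> {
  const result = new Set(changedKeys);
  for (const key of changedKeys) {
    for (const dependent of dependentsOfIndex[key] ?? []) {
      result.add(dependent);
    }
  }
  return result;
}
```

A change to a non-index type (e.g. kinds) re-imports only that type, while an index change fans out to its dependents.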

Other features:

1. The system checks the size of character strings during this process and detects any overflow errors.
2. The system records the table size at the start of the process and validates that the resulting table is at least the original size.
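These two checks can be sketched as guard functions; the names and error messages here are illustrative, not the importer's actual API:

```typescript
// Sketch of the string-length check: fail fast if a value would overflow its column.
function assertStringFits(value: string, maxLength: number, column: string): void {
  if (value.length > maxLength) {
    throw new Error(`Value for column ${column} exceeds ${maxLength} characters`);
  }
}

// Sketch of the table-size check: the import must not leave a table with fewer
// rows than it had before the process began.
function assertTableNotShrunk(rowsBefore: number, rowsAfter: number, table: string): void {
  if (rowsAfter < rowsBefore) {
    throw new Error(`Table ${table} shrank from ${rowsBefore} to ${rowsAfter} rows`);
  }
}
```

Throwing here feeds into step 2d above: the exception aborts that data type's import and is logged to the file.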

Author

Mary Green White

Check package.json for the list of packages that were used as well.

Acknowledgments

  • Neil Robbins for inspiring and assisting in this project.

  • Rick Beyer for asking me to join the PI10 IP Sprint and assisting in the Node.js learning process.

  • Tom Grayson for inspiring me to use TypeScript and for his help along the way, especially in creating innovative queries for content which minimize updating of rows in the database.

  • Miriam Fein-Cole for giving the ok to turn the PI11 IP Sprint project into a real project.

  • Jorge Gonzalez for providing direction on the journey.

  • Doug Swanson, Chris Phillips, Paul Wilson, Greg Carriveau for all of the great help along the way and the final "operationalizing" of it.