Installation
This page explains the steps for a local installation of the Assessment Service.
Runtime Environment:
Java: Java 8 (for Neo4j & Cassandra) and Java 11 (for the service)
Scala: 2.11
Databases:
Neo4j: 3.3.0
By default, Neo4j requires authentication (username & password). For this local setup, authentication should be disabled, as shown in the sketch below.
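A minimal way to do this for local development, assuming the standard Neo4j 3.x configuration file (conf/neo4j.conf), is:

```properties
# conf/neo4j.conf -- local development only: disable authentication
dbms.security.auth_enabled=false
```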
Apache Cassandra: 3.9
Redis: 4.0+
Build Tool:
Maven: 3.3.0+
Source Code Management Tool:
Git
Step 1:
Create a directory named inquiry-service and switch to it.
The Question & QuestionSet service uses the knowlg-core module from the Knowlg BB, so we need to clone two repositories inside the directory created above.
Clone the repository (https://github.com/Sunbird-Knowlg/knowledge-platform.git) using the command below.
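For example, from inside the inquiry-service directory:

```shell
git clone https://github.com/Sunbird-Knowlg/knowledge-platform.git
```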
Based on the release version, please check out the specific tag for the knowlg-core module. For tag information, please refer to the Core Release Tag section of the Assessment Service under Release Notes.
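A typical checkout, where &lt;release-tag&gt; is a placeholder for the tag listed in the release notes:

```shell
cd knowledge-platform
git checkout <release-tag>
cd ..
```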
Clone the repository (https://github.com/Sunbird-inQuiry/inquiry-api-service.git) using the command below.
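For example:

```shell
git clone https://github.com/Sunbird-inQuiry/inquiry-api-service.git
```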
Based on the release version, please check out the specific branch or tag using the command below.
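For example, where &lt;branch-or-tag&gt; is a placeholder for the branch or tag of the release you want:

```shell
cd inquiry-api-service
git checkout <branch-or-tag>
cd ..
```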
Step 2:
Navigate to the inquiry-service/knowledge-platform folder and build the knowlg-core module using the command below.
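A typical build, assuming the knowledge-platform project is built as a whole so that the knowlg-core artifacts land in the local Maven repository (the exact module selection may differ; check the repository's pom.xml):

```shell
# from the parent of the inquiry-service directory
cd inquiry-service/knowledge-platform
mvn clean install -DskipTests
```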
Navigate to the inquiry-service/inquiry-api-service folder and build the entire code base from that location (see the commands below).
You need to build the entire code base if you are installing for the first time.
The Assessment Service is located at inquiry-api-service/assessment-api; from the second build onwards you can build only this service.
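A sketch of the build commands, following the paths mentioned above and skipping tests as in the later steps:

```shell
# First-time installation: build the entire code base
cd inquiry-service/inquiry-api-service
mvn clean install -DskipTests

# Subsequent builds: build only the Assessment Service module
cd assessment-api
mvn clean install -DskipTests
```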
Step 3:
Create the required Cassandra DB schemas using cqlsh. The Assessment Service requires the following keyspaces and tables:
| Keyspace | Table |
| --- | --- |
| hierarchy_store | questionset_hierarchy |
| question_store | question_data |
| category_store | category_definition_data |
To create the above keyspaces and tables, please use the scripts below:
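A minimal sketch of the cqlsh scripts. The keyspaces below use SimpleStrategy with a replication factor of 1, which is reasonable for a single-node local setup; the table columns shown are illustrative assumptions only, so please refer to the official DB script linked below for the authoritative column definitions.

```sql
-- Keyspaces (single-node local setup)
CREATE KEYSPACE IF NOT EXISTS hierarchy_store
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': '1'};
CREATE KEYSPACE IF NOT EXISTS question_store
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': '1'};
CREATE KEYSPACE IF NOT EXISTS category_store
  WITH replication = {'class': 'SimpleStrategy', 'replication_factor': '1'};

-- Tables (column lists are illustrative; see the linked DB script for the actual schema)
CREATE TABLE IF NOT EXISTS hierarchy_store.questionset_hierarchy (
  identifier text PRIMARY KEY,
  hierarchy text
);
CREATE TABLE IF NOT EXISTS question_store.question_data (
  identifier text PRIMARY KEY,
  body blob
);
CREATE TABLE IF NOT EXISTS category_store.category_definition_data (
  identifier text PRIMARY KEY,
  objectmetadata text
);
```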
For the DB schema/script, you can also visit the link below:
Step 4:
Create the required Primary Category (e.g. Practice Question Set, Multiple Choice Question) and its corresponding category definition using the taxonomy-service.
taxonomy-service is a micro-service from the Knowlg BB.
Primary Category is a mandatory property for creating any object (Question, QuestionSet) through the assessment-service. Sunbird provides a set of predefined Primary Categories and their definitions. Users can also create their own Primary Category and its definition using the taxonomy-service.
For Question & QuestionSet, the Primary Categories below can be used:
| Primary Category | Object Type |
| --- | --- |
| Practice Question Set | QuestionSet |
| Curiosity Question Set | QuestionSet |
| Multiple Choice Question | Question |
| Subjective Question | Question |
| FTB Question | Question |
Sunbird Primary Category Curls can be found here:
Sunbird Primary Category Definition Curls can be found here:
Step 5:
Modify the application configuration (inquiry-api-service/assessment-api/assessment-service/conf/application.conf) and run the Maven build using the command (mvn clean install -DskipTests) from the assessment-service folder location, as shown below.
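For example:

```shell
cd inquiry-api-service/assessment-api/assessment-service
mvn clean install -DskipTests
```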
For configuration details, please refer to the Configuration page.
Step 6:
Update the object schemas if required. The object-level schemas are available under the path inquiry-api-service/schemas.
For the detailed schema, please refer to the Schema page.
Step 7:
Run the service from the assessment-service folder location using the command below:
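A typical run command, assuming the service is a Play application built with the play2-maven-plugin (run from the assessment-service folder):

```shell
mvn play2:run
```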
The above command will make the service available at the default port (9000), but it won't be initialised.
To initialise the service, make the first call to it. For example, the Health API can be invoked for this purpose.
The curl for the Health API is as below:
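A sample curl, assuming the service is running on the default port (9000) and exposes a /health route (check the routes file mentioned below):

```shell
curl --location --request GET 'http://localhost:9000/health'
```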
Now the service is up and running, and you can try the available endpoints.
The available endpoints can be checked in the inquiry-api-service/assessment-api/assessment-service/conf/routes file.
To run the service in debug mode, the command below can be used:
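For example, using mvnDebug, which by default listens for a remote debugger on port 8000 (again assuming the play2-maven-plugin):

```shell
mvnDebug play2:run
```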
With the above command, the debugger listens on the default port (8000) and the service won't be initialised. Service initialisation starts once the remote debugger connects to port 8000.
The API specification is available here:
The Assessment Service depends on an async job, async-questionset-publish, to complete the publish operation of an object.
For a local setup where you just want to try out the APIs, this Flink job is not mandatory: the publish API only sends an event to the configured Kafka topic, and the backend job picks up events from that topic and performs the remaining operations asynchronously. So even if the job is not running, the publish API still sends the event and returns a 200 response.
The async-questionset-publish code base is available in the repository below.