Design and Implementation of a System to Detect Severity of Dementia
The proposed methodology explores Hadoop techniques such as MapReduce to process the data faster and more efficiently for dementia detection. Because the dementia dataset is large, the data-partitioning step in many existing solutions becomes complicated owing to redundant transactions among nodes. We address this problem with Min-Hash and LSH-based partitioning expressed in the MapReduce programming model, and we implement the technique on Hadoop using the dementia dataset.
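The Min-Hash/LSH partitioning idea can be sketched as a MapReduce-style mapper: each record's feature set is compressed into a MinHash signature, the signature is split into LSH bands, and one key is emitted per band so that the shuffle phase groups likely-similar records onto the same node. This is a minimal illustrative sketch, not the paper's implementation; the band/row counts, the universal hash family, and the example feature names are our assumptions.

```python
import random
import zlib

NUM_HASHES = 20   # length of each MinHash signature (assumed value)
NUM_BANDS = 5     # LSH bands; rows per band = NUM_HASHES // NUM_BANDS
PRIME = 2_147_483_647  # modulus for the universal hash family h(x) = (a*x + b) % PRIME

random.seed(42)
HASH_PARAMS = [(random.randrange(1, PRIME), random.randrange(0, PRIME))
               for _ in range(NUM_HASHES)]

def minhash_signature(items):
    """MinHash signature of a set of string features (e.g. one patient record).
    `items` must be a re-iterable collection such as a set or list."""
    return [
        min((a * zlib.crc32(x.encode()) + b) % PRIME for x in items)
        for a, b in HASH_PARAMS
    ]

def lsh_map(record_id, items):
    """Mapper: emit one (band_key, record_id) pair per LSH band, so the
    MapReduce shuffle groups likely-similar records on the same reducer,
    avoiding redundant cross-node comparisons."""
    sig = minhash_signature(items)
    rows = NUM_HASHES // NUM_BANDS
    for band in range(NUM_BANDS):
        band_key = (band, tuple(sig[band * rows:(band + 1) * rows]))
        yield band_key, record_id

# Hypothetical records: identical feature sets always land on identical band keys.
rec_a = {"mmse_low", "age_80s", "memory_loss"}
rec_b = {"mmse_low", "age_80s", "memory_loss", "apathy"}  # similar to rec_a
keys_a = {k for k, _ in lsh_map("A", rec_a)}
keys_b = {k for k, _ in lsh_map("B", rec_b)}
```

Records whose Jaccard similarity is high share at least one band key with high probability, which is what makes the banding scheme a useful partitioning key in the MapReduce model.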