AAYUWIZ with Remote Patient Monitoring + Smart Rx + Labs + Referral Module

AAYUWIZ is a product of Sphata Systems. We are shaping AAYUWIZ with added features so that it truly becomes one single platform providing the entire range of solutions for any healthcare unit: private providers, providers, labs, pharmacies, clinics, hospitals, and more.

Here is the list of features we have added to Aayuwiz this year.

1. Remote Patient Monitoring

Remote Patient Monitoring is the module where we address the problem of continued care after a diagnosis or visit. Most providers today want to monitor the patient continuously after care. For that, we are devising our own embedded hardware that displays over HDMI and collects data from various biometric sensors. Initially, we are addressing sensors like:

  1. Glucometer
  2. Pulse Meter
  3. Temperature
  4. Blood Pressure
  5. ECG
  6. EMG

and more. The device connects to a TV and an IR module, and it collects data from the patient at home, in home care, in a nursing home, or anywhere while travelling. The device is designed with simple, TV-channel-like navigation, delivered on something like an Apple TV or a similar micro-computer device.

The collected data is then pushed over GSM / 3G / the Internet to the AAYUWIZ server. The server processes the data, applies all the rules, and intelligently reminds the patient or the doctors if there is any deviation from the norms. If the abnormality is severe, the app immediately schedules an appointment with a doctor and facilitates a video call between doctor and patient using the video-calling facility in Aayuwiz. A rough sketch of this kind of rule check follows.
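To illustrate the kind of rule the server applies, here is a minimal, hypothetical sketch in Java. The class name, thresholds, and severity levels are ours for illustration only; the actual AAYUWIZ rules engine is not shown here.

// Hypothetical sketch of a server-side vitals rule check.
// Thresholds are illustrative, not clinical guidance or the real AAYUWIZ rules.
public class VitalsRuleCheck {

    enum Severity { NORMAL, DEVIATION, SEVERE }

    // Example: systolic blood pressure reading in mmHg
    static Severity checkSystolic(int systolicMmHg) {
        if (systolicMmHg >= 180) return Severity.SEVERE;    // schedule appointment + video call
        if (systolicMmHg >= 140) return Severity.DEVIATION; // remind patient and doctors
        return Severity.NORMAL;
    }

    public static void main(String[] args) {
        System.out.println(checkSystolic(150)); // DEVIATION -> reminder sent
        System.out.println(checkSystolic(190)); // SEVERE -> appointment auto-scheduled
    }
}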

This way, we collect data from patients with a very affordable device (pricing yet to be announced) and close the gap in continued care for the patient.

2. Smart Rx – Smart Medication Prescription Module

We call it Smart Rx because the Rx prescription module of AAYUWIZ has a few important features:

  1. Brand / Generic Medicine search & prescribe
  2. Preset of default Frequency & Duration for each medicine
  3. Detailed Drug information with Pricing
  4. Drug Allergy notification

As listed, the prescription module now includes generic-medicine mapping along with dosage values, which allows medicines to be searched either by brand name or by generic name. Each medicine can carry default options for frequency (say, 1-0-1 a day) and duration (say, 2 days), which saves the doctor time spent on the system. Also, when searching for a medicine, its detailed information is shown clearly in a pane, along with indications and contraindications, to help the doctor prescribe the right medicine. Another important feature is the Drug Allergy Notification: if a medicine has been marked as allergic (in AAYUWIZ), the notifier alerts the doctor if it is prescribed again. A rough sketch of such a medicine record follows.
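As a rough illustration of the record this implies, here is a hypothetical Java sketch of a medicine entry; the field names are ours, not AAYUWIZ's actual schema.

// Hypothetical sketch of a Smart Rx medicine entry; field names are illustrative.
public class MedicineEntry {
    String brandName;        // searchable by brand name
    String genericName;      // searchable by generic name
    String dosage;           // e.g. "500 mg"
    String defaultFrequency; // e.g. "1-0-1" (morning-noon-night)
    int defaultDurationDays; // e.g. 2
    String drugInformation;  // details, indications, contraindications
    double price;            // shown alongside the drug information
    boolean markedAllergic;  // triggers the allergy notifier if prescribed again
}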

3. Labs

Labs is another module that has been redefined with lots of features. One important thing I would like to highlight today is networked diagnostic centers. With this feature, after selecting the labs to be tested, the doctor also selects a diagnostic center: a favorite, an internally associated lab (within the clinic or hospital), or even an external diagnostic center. The lab user then logs in to AAYUWIZ, gets notified, clicks on the message, and enters the lab data directly. The data is added to the patient record, and the doctor is notified in turn. That way, the whole lab system is networked.

4. Referral Module

Today, we see that doctors are well connected. They refer their patients to other doctors in their network based on specialty or criticality of care. In Aayuwiz, we have made this a simple workflow: the doctor searches for a doctor by network, hospital, or area, and then refers the patient. Once referred, the other doctor gets access to the patient's record along with the care note given by the referrer.

5. Search & Schedule

We have an external application, as well as an internal module, where a patient can go online, search for doctors by specialty, location, and so on, and then schedule an appointment. Once done, the doctor gets a notification and can confirm the appointment. That app is also integrated with the scheduler module of AAYUWIZ.


Ultimately, what we are trying to address are some of the common issues we see in connecting various solutions together under one platform. That is AAYUWIZ! Expect more posts in the near future with updated and added modules.

Loading Maven dependencies for OSGi Bundles – Apache Solr / Lucene in Apache Karaf

Hi everyone,
Today brought great learning for me and my team as we tried to build Apache Solr within Apache Karaf. I am going to talk about the missing-dependencies issue in Karaf when bundling with the Felix maven-bundle-plugin. This is quite common, and I have taken Solr as the example here to explain it to you all.

What is Apache Karaf?
Apache Karaf is a small OSGi based runtime which provides a lightweight container onto which various components and applications can be deployed.

What is Apache Solr?
Solr is the popular, blazing-fast open source enterprise search platform from the Apache Lucene™ project. Its major features include powerful full-text search, hit highlighting, faceted search, near-real-time indexing, dynamic clustering, database integration, rich document (e.g., Word, PDF) handling, and geospatial search.

Creating a Client Project using SolrJ
SolrJ is a Java client for accessing Solr. It offers a Java interface to add, update, and query the Solr index; a short usage sketch follows. To add this capability to our Karaf container, we created a pom.xml with the minimal dependencies shown under Problem below.
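For context, here is a minimal SolrJ usage sketch, assuming a Solr 4.x server at http://localhost:8983/solr; the URL and field names are placeholders.

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrServer;
import org.apache.solr.client.solrj.response.QueryResponse;
import org.apache.solr.common.SolrInputDocument;

public class SolrJExample {
    public static void main(String[] args) throws Exception {
        // The URL is a placeholder; point it at your Solr core
        HttpSolrServer server = new HttpSolrServer("http://localhost:8983/solr");

        // Add a document to the index
        SolrInputDocument doc = new SolrInputDocument();
        doc.addField("id", "1");
        doc.addField("name", "sample document");
        server.add(doc);
        server.commit();

        // Query the index
        QueryResponse response = server.query(new SolrQuery("name:sample"));
        System.out.println(response.getResults());
    }
}

The client itself compiles fine; the trouble begins when bundling it for Karaf.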

Problem
<!-- SOLR CORE AND ITS DEPENDENCIES -->
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-core</artifactId>
<version>4.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-solrj</artifactId>
<version>4.3.1</version>
</dependency>

Then, when deploying it in Karaf, errors showed up, which you can see by reading the bundle headers:
karaf@root> headers <START_ID>

Reason
The headers output shows some packages highlighted in red, meaning we failed to satisfy those dependencies through the Maven pom.xml. At first we tried including every missing dependency, but it was like a fission effect: the list of missing dependencies just kept growing. There is, however, a simple approach that eliminates the missing-dependency issue. Here is a sample of the Felix bundle plugin definition in our pom.xml:

Solution
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.3.7</version>
<extensions>true</extensions>
<executions>
<execution>
<id>bundle</id>
<phase>package</phase>
<goals>
<goal>bundle</goal>
</goals>
</execution>
</executions>
<configuration>
<instructions>
<Bundle-Name>BUNDLE NAME</Bundle-Name>
<Bundle-Description>This bundle will be used as a common registry
service across the platform</Bundle-Description>
<Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
<Import-Package>${bundle.import.package}</Import-Package>
<Export-Package>${bundle.export.package}</Export-Package>
<Embed-Transitive>true</Embed-Transitive>
<Embed-Dependency>*;scope=compile|runtime;</Embed-Dependency>
<DynamicImport-Package>${bundle.dynamicImport.package}</DynamicImport-Package>
</instructions>
</configuration>
</plugin>
</plugins>
</build>

The properties referenced above are defined, in the real world, as:


<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<bundle.import.package>com.**.**,*</bundle.import.package>
<bundle.private.package>com.**.**</bundle.private.package>
<bundle.export.package></bundle.export.package>
<bundle.dynamicImport.package>
org.apache.log,
all.missing.dependencies
</bundle.dynamicImport.package>
</properties>

Sample
Now we need to replace all.missing.dependencies with all the packages listed in red in Karaf. In my case, for Apache Solr, I added the list given below:

<bundle.dynamicImport.package>
com.sun.jdmk.comm,
com.sun.management,
com.sun.msv.grammar,
com.sun.msv.grammar.trex,
com.sun.msv.reader,
com.sun.msv.reader.trex.ng,
com.sun.msv.reader.util,
com.sun.msv.verifier,
com.sun.msv.verifier.regexp,
com.sun.msv.util,
com.vividsolutions.jts.algorithm,
com.vividsolutions.jts.geom,
com.vividsolutions.jts.io,
com.vividsolutions.jts.operation.union,
com.vividsolutions.jts.operation.valid,
com.vividsolutions.jts.simplify,
com.vividsolutions.jts.util,
javax.jms,
javax.portlet,
org.apache.log,
org.jboss.netty.bootstrap,
org.jboss.netty.buffer,
org.jboss.netty.channel,
org.jboss.netty.channel.group,
org.jboss.netty.channel.socket.nio,
org.relaxng.datatype,
sun.misc
</bundle.dynamicImport.package>

Reasoning
The magic here is that we added all the missing / unavailable packages as dynamic imports to the Felix bundle plugin, which places them in the DynamicImport-Package section of the generated bundle manifest. This tells the Apache Karaf container that those packages need not be part of the bundle; they are wired in at runtime if and when they become available. You can confirm the result by running headers on the deployed bundle again. This way, you eliminate the missing-dependency issues for classes that need not be bundled along with the core jars.

Here is my full POM file:
<?xml version="1.0"?>
<project
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"
xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<modelVersion>4.0.0</modelVersion>
<parent>
<groupId>com.kousik.package</groupId>
<artifactId>yourartifact</artifactId>
<version>0.5</version>
</parent>
<groupId>com.package</groupId>
<artifactId>artifact-impl</artifactId>
<version>0.5</version>
<name>artifact-impl</name>
<url>http://maven.apache.org</url>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<bundle.import.package>com.package.api,*</bundle.import.package>
<bundle.private.package>com.package.impl.ClassImpl</bundle.private.package>
<bundle.export.package></bundle.export.package>
<bundle.dynamicImport.package>
com.sun.jdmk.comm,
com.sun.management,
com.sun.msv.grammar,
com.sun.msv.grammar.trex,
com.sun.msv.reader,
com.sun.msv.reader.trex.ng,
com.sun.msv.reader.util,
com.sun.msv.verifier,
com.sun.msv.verifier.regexp,
com.sun.msv.util,
com.vividsolutions.jts.algorithm,
com.vividsolutions.jts.geom,
com.vividsolutions.jts.io,
com.vividsolutions.jts.operation.union,
com.vividsolutions.jts.operation.valid,
com.vividsolutions.jts.simplify,
com.vividsolutions.jts.util,
javax.jms,
javax.portlet,
org.apache.log,
org.jboss.netty.bootstrap,
org.jboss.netty.buffer,
org.jboss.netty.channel,
org.jboss.netty.channel.group,
org.jboss.netty.channel.socket.nio,
org.relaxng.datatype,
sun.misc
</bundle.dynamicImport.package>
</properties>
<dependencies>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>3.8.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.package.api</groupId>
<artifactId>search-api</artifactId>
<version>0.5</version>
</dependency>
<!-- SOLR CORE AND ITS DEPENDENCIES -->
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-core</artifactId>
<version>4.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.solr</groupId>
<artifactId>solr-solrj</artifactId>
<version>4.3.1</version>
</dependency>
<dependency>
<groupId>commons-logging</groupId>
<artifactId>commons-logging</artifactId>
<version>1.1.1</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>1.5.6</version>
</dependency>
<dependency>
<groupId>org.apache.lucene</groupId>
<artifactId>lucene-sandbox</artifactId>
<version>4.0.0</version>
</dependency>
</dependencies>
<build>
<plugins>
<plugin>
<groupId>org.apache.felix</groupId>
<artifactId>maven-bundle-plugin</artifactId>
<version>2.3.7</version>
<extensions>true</extensions>
<executions>
<execution>
<id>bundle</id>
<phase>package</phase>
<goals>
<goal>bundle</goal>
</goals>
</execution>
</executions>
<configuration>
<instructions>
<Bundle-Name>Project Impl</Bundle-Name>
<Bundle-Description>This bundle will be used as a common registry
service across the platform</Bundle-Description>
<Bundle-SymbolicName>${project.artifactId}</Bundle-SymbolicName>
<Import-Package>${bundle.import.package}</Import-Package>
<Export-Package>${bundle.export.package}</Export-Package>
<Embed-Transitive>true</Embed-Transitive>
<Embed-Dependency>*;scope=compile|runtime;</Embed-Dependency>
<DynamicImport-Package>${bundle.dynamicImport.package}</DynamicImport-Package>
</instructions>
</configuration>
</plugin>
</plugins>
</build>
<repositories>
</repositories>
</project>

You can also download this pom.xml here.
Now that we knew the technique, we started using the same approach for our other services too.

Thanks,
Kousik Rajendran.

Service Wrapper for Play Framework

Today, I would like to share a script that runs Play Framework projects as a service on Linux. As administrators, we always need to turn a project into a typical service that can be handled with commands like "service play start" and "service play stop", i.e., something that runs Play in the background (silent mode) as a Linux service.

What is Play?

Play Website says: “Play is a high-productivity Java and Scala web application framework that integrates the components and APIs you need for modern web application development. Play is based on a lightweight, stateless, web-friendly architecture and features predictable and minimal resource consumption (CPU, memory, threads) for highly-scalable applications thanks to its reactive model, based on Iteratee IO.”

Why Play?

At Sphata Systems, we are developing a platform, "HUB", that is a purely cloud-friendly platform. Before we even started thinking about the design, we first set out to understand the problems of any application hosted (or to be hosted) in the cloud: is it scalable, flexible, modularized, and so on. Once we clearly understood the demands of a cloud application, we dismantled each and every block of the system so that each block / module could be treated as a standalone, scalable service that runs and grows on its own. This design lets us scale the computing capacity of the instance hosting any service in the cloud up or down. At Sphata, we are trying to come up with a patentable approach to designing a cloud-based platform.

For a highly demanding design like this, we needed a front-end web container with the capacity for distribution in the cloud, flexible enough for us to develop and manage within our custom containers. For this, we chose the Play Framework. We use Play to manage all the views / GUI that interact directly with our service containers via web calls.

Need for Startup Scripts:

Play Framework supports CI well, and even though there are quite a lot of options in Play to package and build for production deployment, there is no great service wrapper built in. As administrators, we need a startup script that starts the project and runs it silently, and we should be able to stop, start, and manage it like any other Linux service. There are plenty of scripts in the forums that automate running a Play project as a Linux service; I took one good example from Play's Google Groups and customized it slightly to fit my needs. It works fine on my cloud instance.

Startup Script to Run Play as a Service
Just copy the code below into a file named "play", save it in "/etc/init.d", and run "chmod +x play". The script is then ready to be used as an executable service.

#!/bin/bash
# chkconfig: 345 20 80
# description: Play start/shutdown script
# processname: play
#
# Installation:
# copy file to /etc/init.d
# chmod +x /etc/init.d/play
# chkconfig --add /etc/init.d/play
# chkconfig play on
#
# Usage: (as root)
# service play start
# service play stop
# service play status
#
# Remember: you need Python 2.6 to run the play command; it doesn't come standard with RedHat/CentOS 5.5
# Also, you may want to temporarily remove the >/dev/null for debugging purposes

# Path to play install folder
PLAY_HOME=/var/play
PLAY=$PLAY_HOME/play

# User running the Play process
USER=root

# Path to the application
APPLICATION_PATH=/var/play/helloworld

# source function library
. /etc/init.d/functions
RETVAL=0

cd ${APPLICATION_PATH}

start() {
echo -n "Starting Play service: "
su -s /bin/sh $USER -c "target/start >/dev/null &"
RETVAL=$?

if [ $RETVAL -eq 0 ]; then
echo_success
else
echo_failure
fi
echo
}
stop() {
su -s /bin/sh $USER -c "kill -9 `cat ${APPLICATION_PATH}/RUNNING_PID` >/dev/null"
su -s /bin/sh $USER -c "rm -rf ${APPLICATION_PATH}/RUNNING_PID >/dev/null"
# su -s /bin/sh $USER -c "play clean stage >/dev/null"
echo -n "Stopping Play Application: ${APPLICATION_PATH}"
RETVAL=$?

if [ $RETVAL -eq 0 ]; then
echo_success
else
echo_failure
fi
echo
}
status() {
su -s /bin/sh $USER -c "${PLAY} state"
RETVAL=$?
}
clean() {
su -s /bin/sh $USER -c "rm -f ${APPLICATION_PATH}/server.pid"
su -s /bin/sh $USER -c "rm -f ${APPLICATION_PATH}/RUNNING_PID"
}
case "$1" in
start)
clean
start
;;
stop)
stop
;;
restart|reload)
stop
sleep 10
start
;;
status)
status
;;
clean)
clean
;;
*)
echo "Usage: $0 {start|stop|restart|status}"
esac
exit 0

After copying the above script to /etc/init.d/, you can run:

# chmod +x /etc/init.d/play
# service play start
....
# service play stop

Note: You need to customize this script based on where you installed Play and where your project lives. With those paths set, the script should handle Play as a service.

How to use this script for multiple Play projects?
This is quite simple: just clone this service file, rename the copy to play_<project_name>, edit it with the respective paths for that project, and then start it as a separate service.
