Hi all!
I would like to start an open discussion on the specific technical details of the components we need to implement or integrate with existing functionality. I'll try to break everything down into digestible parts so we can devise an action plan to get things moving for each one:
2. Application Shell - While I am not fully confident yet, the basic requirement seems to be that we can interface with other content sources or apps, connecting to them through an API or reading data stored on the device. There are a few technologies we could use to build the app, each with its own benefits and drawbacks, which we will need to analyze. Here are a few approaches we could take:
a) HTML - this is relatively simple, since we can use existing HTML skills to build the framework. HTML apps generally have a smaller footprint and are less graphics-intensive, though notably Facebook moved away from HTML a few years ago. I am fairly confident that creating the UI part will be nearly impossible with an HTML app. Question: can we interface an HTML shell with a Unity UI component? If we go this route, we could use a cross-platform HTML framework such as Apache Cordova.
b) Native Android Application - ultimately I think this is the route we will need to take, as it will give us much greater flexibility in integrating other components that may be developed by our partners.
c) Unity Shell - this may just be my ignorance, but I assume it is also possible to use Unity to develop an application shell? If so, this could also be a good MVP approach, since using a common framework for all development would reduce integration work.
3. Exploration UI - this is essentially going to be part of the Application Shell, and given the high interactivity and dynamic vector-based map generation, Unity seems a good choice to start with.
4. User Profile - essentially, we are building an off-line LMS, so we need a simple screen showing the user's progress and assessment details. To prove our approach, we will need to use xAPI to store and retrieve progress data from the profile.
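To make the xAPI part concrete, here is a rough sketch of what a single progress statement could look like. xAPI statements are JSON with an actor/verb/object structure; the learner mbox, activity URI, lesson name, and score below are all hypothetical placeholders, not anything from our actual content:

```json
{
  "actor": {
    "objectType": "Agent",
    "name": "Sample Learner",
    "mbox": "mailto:learner@example.com"
  },
  "verb": {
    "id": "http://adlnet.gov/expapi/verbs/completed",
    "display": { "en-US": "completed" }
  },
  "object": {
    "objectType": "Activity",
    "id": "http://example.com/lessons/hygiene-unit-1",
    "definition": { "name": { "en-US": "Hygiene: Unit 1" } }
  },
  "result": {
    "score": { "scaled": 0.85 },
    "completion": true,
    "success": true
  }
}
```

For the off-line case, statements like this could be queued locally and synced to a Learning Record Store when connectivity is available; that sync strategy is something we would still need to design.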
5. Offline Content & Resource Library - we need the content that will be launched from the Exploration UI. A wide variety of content will be made available: videos, written content, apps covering one or more lesson units, and complete lesson plans currently residing in various LMS platforms. The main goal here is to take some of this content off-line and make it available through the app. Here are a few examples of the types of content we should see in the app:
ELA - ???
Skills (Hygiene, etc) - ???
6. Learning Tree & Content Metadata - this is going to be one of the core innovations of this platform: a comprehensive map of individual lesson units. I think we all recognize the complexity of this effort; it will be very hard to create a global map containing everything. My suggestion is that we build on top of existing efforts (the Learning Registry and Genome projects). We can also start by integrating common curriculum standards, such as CCSS or WNCP, and then build on top of them by adding other lesson types. I know I am generalizing, but that's the concept here. The other part of this is defining the metadata standards needed to accurately map each lesson unit to a piece of content. The LRMI project is likely what we should utilize; essentially, it requires that educational content be marked up with the metadata properties it defines.
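To make the LRMI idea concrete, here is a sketch of what the markup for one lesson unit might look like as schema.org JSON-LD. The lesson name and the CCSS standard code are illustrative examples only, not a mapping decision:

```json
{
  "@context": "https://schema.org/",
  "@type": "LearningResource",
  "name": "Fractions: Adding with Unlike Denominators",
  "learningResourceType": "lesson",
  "inLanguage": "en",
  "educationalAlignment": {
    "@type": "AlignmentObject",
    "alignmentType": "teaches",
    "educationalFramework": "CCSS.Math",
    "targetName": "CCSS.Math.Content.5.NF.A.1"
  }
}
```

The `educationalAlignment` property is what would let us attach a lesson unit to a node in the Learning Tree (a curriculum standard here, but potentially other lesson types later).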
7. Encryption - for the purposes of the MVP, we can utilize a standard encryption algorithm such as AES through the platform's native crypto APIs. We can look at other options once we start full app development.
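To illustrate how little ceremony the MVP encryption needs, here is a minimal Java sketch of an AES-GCM encrypt/decrypt round trip using the standard `javax.crypto` APIs, which are also available on Android. The payload string and the 256-bit key size are arbitrary choices for illustration:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;

public class AesDemo {
    public static void main(String[] args) throws Exception {
        // Generate a fresh 256-bit AES key (on Android this could live in the Keystore).
        KeyGenerator keyGen = KeyGenerator.getInstance("AES");
        keyGen.init(256);
        SecretKey key = keyGen.generateKey();

        // GCM needs a unique IV per encryption; 96 bits is the recommended size.
        byte[] iv = new byte[12];
        new SecureRandom().nextBytes(iv);

        byte[] plaintext = "offline lesson payload".getBytes(StandardCharsets.UTF_8);

        // Encrypt with a 128-bit authentication tag.
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);

        // Decrypt with the same key and IV; GCM also verifies integrity here.
        cipher.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        byte[] decrypted = cipher.doFinal(ciphertext);

        System.out.println(Arrays.equals(plaintext, decrypted)); // true
    }
}
```

GCM is worth preferring over plain AES/CBC because it authenticates the ciphertext as well as encrypting it, so tampered content fails to decrypt rather than producing garbage.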
8. Text to Speech, Browser, Media Player, Reader - these are all available as native Android components, and we can utilize them within our app without much additional effort.
Please treat this as an open-ended conversation and comment or contribute in any way you can.