Madrid Laboratorio Urbano: prototyping workshop at Medialab Prado

Medialab Prado has launched the call for the collaborative prototyping workshop and international symposium Madrid Laboratorio Urbano, dedicated to infrastructures, practices and tools for rethinking life in common:

Madrid Laboratorio Urbano proposes to explore the relationship between the city, digital culture and the commons through a programme that includes a collaborative production workshop and a series of lectures and debates featuring international guests and local projects, reflecting on the meaning of the commons and digital culture in the evolution of the city.

The call for proposals is open until 4 May, and the workshops will run in two phases, in May and October. Fifteen projects will be selected, addressing topics related to public space and open infrastructures, the inclusive city, and citizen science, and they will be developed in teams following Medialab Prado's methodology. I am taking part alongside Antonio Lafuente and José Luis de Vicente as an advisor and, in my case, as facilitator of the public space and infrastructures projects.

This is the text we have prepared as an introduction and context for the workshop:

Introduction

Madrid Laboratorio Urbano proposes to explore the relationship between the city, digital culture and the commons through a programme that includes:

  • A collaborative production workshop in which, building on local projects already under way, new tools, platforms and actions will be designed and prototyped. The workshop will follow Medialab-Prado's usual methodology: an open call for the selection of proposals, registration of interested collaborators, formation of interdisciplinary working groups, and development of prototypes in a process supported by facilitators and advisors from different fields.
  • A series of lectures and debates featuring international guests and local projects, reflecting on the meaning of the commons and digital culture in the evolution of the city.

Context

We live in turbulent times that mix crisis and transformation: the obsolescence of various social, political and economic practices and values, and the emergence of others which, chaotically, are beginning to sketch out new models. The city, like nothing else, is the main stage for this process. And Madrid is shaping up as an unplanned laboratory of citizen projects of all kinds that explore the possibilities and limits of the new social culture, and in particular of open/free culture, as a shaper of public space and of the social, economic and political relationships it generates.

Thus in Madrid we find social movements taking up the most diverse demands, collectives of amateurs and professionals investigating through action on public space and citizen participation, and various citizen laboratories where communities gather to reflect, through making, on the city or the economy. And all this happens while the public and the private, government and business, try to rediscover their reasons for being, and in some cases even attempt to reinvent themselves, whether out of necessity or opportunity, as platforms that generate communities or embed themselves in existing ones.

If anything characterises this process, it is its apparent, and perhaps real, chaos: the difficulty of grasping its deeper meaning or understanding its ultimate causes. Its medium- and long-term consequences are, of course, even less predictable. This scenario thus provokes mixed emotions, ranging from panic and hopelessness to excitement about a possible real change. And in this emotional clash we all find ourselves: the agents of the old system and the activists of what could be the new one; the marginalised and the decision-makers…

If anything could bring together and make some sense of what is happening, it is the very concept of the commons. We are now rediscovering the power of what lies, to use a negative definition, between the public and the private: that enormous space where our lives unfold, which for decades was erased and which we believed no longer existed. Only by naming that space and understanding its properties can we begin to understand these emerging phenomena. And only by understanding what the commons is and how it works can we grasp its chaotic dynamics, the direct result of a system with a complex, diverse and sophisticated governance that has little to do with classic political and corporate systems.

The other concept that can help us understand the present is digital culture. Technology has empowered us by becoming a basic relational and knowledge infrastructure. But its role is not only instrumental, however important that may be. Particularly active communities have been working around technology for decades, consolidating new practices and values that until recently were considered marginal, perhaps even dangerous. Yet today openness, transparency and collaboration are reference points that institutions and companies try to embrace as they attempt to join, and even appropriate, for better and for worse, this new reality.

Madrid as a laboratory: Madrid is one of the most significant settings for the encounter between the city, digital culture and the commons, an emerging laboratory of citizen innovation and governance. Medialab Prado has contributed to this exploration by becoming one of the city's community generators, while itself reflecting the contradictions and opportunities of the new scenario: a public initiative that hosts a commons lab and where communities of amateurs and professionals organise themselves.

The international workshop Madrid Laboratorio Urbano proposes to explore the relationship between the city and the commons using the approaches and methodologies of the Interactivos? and Visualizar projects developed over recent years. What can Medialab contribute to a theme already highly developed in the city of Madrid? The ultimate goal is to create a space for collaboration where local projects and international initiatives can meet to design and prototype new tools and experiences and, in doing so, foster debate and reflection on the meaning of the commons and digital culture in the evolution of the city. In this sense, the proposal focuses on seeking synergies among existing initiatives and reflecting on the consequences of these processes.

Vote for the Top 100 Tools for Learning 2013

Below is an extract from the original post on the Centre for Learning & Performance Technologies blog

The annual Top 100 Tools for Learning list has become very popular. The 2011 list has now been viewed over 880,000 times on SlideShare, and the 2012 list over 550,000 times. The list was also cited in KPCB’s 2013 Internet Trends presentation (viewed over 2.3 million times).

Voting for the Top 100 Tools for Learning 2013 – the 7th Annual Survey – is currently underway. The list will be compiled from the votes of learning professionals worldwide.

Voting closes at midnight GMT on Friday 27 September 2013, and the Top 100 Tools list will be revealed on Monday 30 September 2013.

What is a learning tool?

A learning tool is a software tool or an online tool or service:

  • either for your own personal or professional learning
  • and/or one for teaching, training or e-learning

Voting guidance

  • You will need to name ten different tools for your entry to count.
  • Please name ten specific tools rather than generic technologies (e.g. “Blogger” or “WordPress”, rather than “blogs” or “blogging”).
  • A vote for Google will be a vote for Google Search, so if you want to vote for other Google tools, you will need to name them individually, e.g. Google Docs/Drive, Google Scholar, Google Maps, etc. A vote for Google Apps will be for the badged collection of Google Apps for Business or Education, not a generic term for all Google applications.
  • A vote for Microsoft Office will be split over Word and PowerPoint, so please vote individually for these tools, i.e. Word, PowerPoint, Excel, Outlook etc.
  • Only one contribution can be accepted from a vendor company.
  • Finally, contributions are accepted only at the discretion of the Centre for Learning & Performance Technologies.

How to vote

EITHER use the form on this page to vote for your top 10 tools for learning.  For each tool that you name you can optionally add any information about how you use it or why you like it. If you are happy for us to quote your choices (and reasons) publicly at any time, please check the box at the bottom of the form, otherwise your selection/reasons will remain anonymous.

OR tweet your 10 choices to @C4LPT from a valid Twitter account that provides your credentials as a learning professional.

We are very proud that in previous years eFront users have voted for our tool. To see the 2012 winners please click here. We will be putting our own lists together before the closing date of September 27th, and if your favorite elearning tool is eFront’s LMS, please be sure to let them know!

5 things beyond open source eFront

We recently put together a list of the coolest features (and the all-important ’6th element’) in the commercial editions of eFront. From specialized reports to TinCan, which offers the freedom to learn in more engaging ways, here we list the features that stand out the most:

1. REPORTS GENERATOR

There is no end to the need for specialized reports! And although it isn’t feasible to ship a canned report for each and every individual need, eFront Enterprise offers a flexible way around this by allowing users to design their own!

2. SKILL-GAP TESTS

Before users even take a course, skill-gap tests can be used to determine which courses they should take. Skill-gap tests spot the gaps in user knowledge and map them to courses especially designed to fill those gaps. Skill-gap tests are an advanced functionality usually found in expensive talent management software – but they come as standard in eFront Educational and eFront Enterprise.

3. BRANCHES

Multi-tenancy, aka Branches, is a hot topic for LMSs, and for good reason. If your organization has multiple teams and each team requires its own courses, users and themes, yet at the same time you, as the LMS owner, need a centralized way to administer everything, then you most definitely need this functionality.

4. NOTIFICATIONS

Constant communication is a key factor in successful eLearning implementations. Notifications allow you to communicate through automated, yet customized, emails triggered by any number of system events.
For example, you can send an email a week or a day before a course expires – this functionality comes in handy when users have forgotten to complete their courses.
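As a rough illustration of the expiry-reminder idea, a scheduled job could compare each enrollment's expiry date with today's date and pick out the users due a one-week or one-day reminder. This is a generic sketch, not eFront's actual notification engine; all names and the data layout are hypothetical:

```python
from datetime import date

def due_reminders(enrollments, today, offsets=(7, 1)):
    """Return (user, course, days_left) tuples for enrollments whose
    course expires in exactly one of `offsets` days from `today`."""
    reminders = []
    for user, course, expires in enrollments:
        days_left = (expires - today).days
        if days_left in offsets:
            reminders.append((user, course, days_left))
    return reminders

# Example: alice's course expires in 7 days (remind), bob's in 3 (skip).
today = date(2013, 1, 10)
enrollments = [
    ("alice", "Safety 101", date(2013, 1, 17)),
    ("bob",   "Safety 101", date(2013, 1, 13)),
]
print(due_reminders(enrollments, today))  # [('alice', 'Safety 101', 7)]
```

A real implementation would run this daily from a cron-style scheduler and hand each tuple to the mailer.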

5. TINCAN

The world of eLearning standards is changing rapidly. The TinCan API, the successor of SCORM, promises a number of freedoms: freedom from the browser, freedom to work offline, freedom to use your mobile or tablet, and the freedom to learn through more engaging ways such as games. eFront is always at the forefront of such developments and offers exceptional support for TinCan API – and is one of the first LMSs to do so.

– THE 6th ELEMENT –

We pride ourselves on our ability to offer professional hands-on support for all commercial editions of eFront.

For more on our commercial solutions or the open source eFront edition please visit our website or contact us directly!

Open Source eLearning Network Event

On July 12th, 2013 in Birmingham, England, the eLearning Network will be holding a day of practical case studies, shared experiences and debate covering how you should procure the technologies to support learning in your organization. A number of experienced speakers and panel members will also be present to discuss topics such as ‘An introduction to open source elearning tools’, ‘Open Source LMS – Wouldya? Couldya? Shouldya?’ and more.

You will also be able to explore a vast array of LMSs, authoring tools and other technologies including eFront, Exe, Ilias, Moodle, Xerte to name but a few.

The key questions which will be looked at on the day are:

  • Are there any examples of open-source software that isn’t designed for geeks?
  • When is it a good idea to look at open-source software?
  • How can we make open-source software work in our organisation?
  • Why would you choose software that doesn’t come with any support?
  • What objections might your IT department raise about open-source software, and how can they be addressed?

This workshop will be useful to:

  • In-house L&D staff under cost pressure to get more from learning technologies
  • L&D practitioners (both in-house and consultants) who want to offer a wider blend of learning than classic self-paced e-learning
  • People who are frustrated that the proprietary software they use for learning is not flexible or customisable enough for their needs

For further information please go directly to the eLN event page: http://www.elearningnetwork.org/events/open-source

Thank you! eFront LMS is a finalist at Best of Elearning! 2013 Awards

Epignosis, the global vendor of Enterprise, Cloud and Open Source learning management systems, has been shortlisted at this year’s Best of Elearning! Awards.

For a third consecutive year, Epignosis has been named a finalist for the eFront LMS at the Best of Elearning! 2013 Awards in the category ‘Best Open Source Solutions.’

Now in its ninth year, Elearning! Media Group hosts the only Readers’ Choice Awards in the enterprise learning and workforce technology market. Recipients are chosen by Elearning! Magazine’s reader community and users of elearning products and services via an open-ended online ballot. The winners will be named at the Best of Elearning! Awards and Luncheon hosted at the Enterprise Learning! Conference & Expo on Monday, August 26th.

Catherine Upton, Group Publisher of Elearning! Media Group, said the following about the awards: “Every finalist is a winner in the Best of Elearning! Awards. Given the high volume of votes and the number of nominated products, every one of these solution providers should be proud to be honored for excellence. The Best of Elearning! Awards programme formalizes the informal ‘word-of-mouth’ referrals practiced in our industry.”

At eFront we are honored to be nominated by learning professionals around the world for the third year in a row. There is no greater honor than to be recognized by the executives and business managers who use your products and services.

We would like to thank everyone who nominated Epignosis’ eFront LMS as one of the best learning solutions this year!

See you on August 26th at the Enterprise Learning! Conference & Expo where the winner will be announced!

To read the official Press Release click here.

eFront 3.6.12 just released

What better time for new toys than Christmas!

Today we would like to announce a new version of eFront. This maintenance update includes important speed optimizations, full-text search for documents for our enterprise clients, a module to bootstrap module production, a sleek new modern theme, dozens upon dozens of minor bug fixes, and several tweaks to make your favorite tool even more enjoyable!

As we pick up eFront development once more, we will reuse some of the TalentLMS visuals and functions for a next version scheduled for March. All eFront development is done ensuring compatibility with previous eFront versions. This is a hard requirement that slightly limits our ability to improve everything we wanted to improve, but it ensures an easy transition to the newest eFront version for all of you.

Below you can find a short description of some of the key elements of this version:

It’s faster!

This version brings considerable performance updates for installations with a large volume of data and users. In particular, we have substantially reworked the underlying engines for branches and tests.

A module to bootstrap module production

One of the key characteristics of eFront is its extensibility through modules. In this version we have bundled a bootstrap module that lets you create an eFront extension by completing a form with the required characteristics. It certainly won’t do all the work for you, but it is a major time-saver if you plan to create an eFront module.
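The idea behind such a scaffolding tool is simple: take the values the form collects and render them into a module skeleton. The sketch below illustrates this in Python rather than eFront's own PHP module system, and the template contents (class and method names) are hypothetical, not eFront's real module API:

```python
# Template for a module skeleton; doubled braces escape literal PHP braces.
TEMPLATE = """<?php
class {class_name} extends EfrontModule {{
    public function getName() {{ return "{title}"; }}
    public function getPermittedRoles() {{ return array({roles}); }}
}}
"""

def scaffold_module(name, title, roles):
    """Render a module skeleton from the values a bootstrap form collects."""
    class_name = "module_" + name.lower().replace(" ", "_")
    roles_php = ", ".join('"%s"' % r for r in roles)
    return TEMPLATE.format(class_name=class_name, title=title, roles=roles_php)

code = scaffold_module("IdleUsers", "Idle users", ["administrator"])
print(code)
```

The generated file would then be dropped into the modules directory and filled in with the module's actual logic.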

Full-text search for PDFs, DOCs etc

In eFront 3.6.12 we undertook the project of implementing full-text search across all files uploaded to the system. This means you can search for text inside PDF, Doc or Excel files you upload. In essence, this functionality turns eFront into a simple but efficient document management system. We use the power of the Xapian search engine, along with OpenOffice’s excellent conversion scripts, to provide a seamless yet powerful integration with eFront’s own search engine. More details on how to set up full-text search will be provided in a separate post. (Linux only – sorry, Windows users!)
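Conceptually the feature is a convert-then-index pipeline: an external converter turns each uploaded document into plain text, and the text is fed to the search engine. The sketch below shows the shape of that pipeline with a toy in-memory inverted index standing in for Xapian, and a stub converter standing in for the OpenOffice scripts; none of this is eFront's actual code:

```python
import re
from collections import defaultdict

def convert_to_text(filename, raw_bytes):
    # Stand-in for the external converter (e.g. OpenOffice for .doc files);
    # here we just assume the bytes decode as UTF-8 text.
    return raw_bytes.decode("utf-8", errors="ignore")

class FullTextIndex:
    def __init__(self):
        self.postings = defaultdict(set)  # word -> set of filenames

    def add(self, filename, raw_bytes):
        text = convert_to_text(filename, raw_bytes)
        for word in re.findall(r"\w+", text.lower()):
            self.postings[word].add(filename)

    def search(self, query):
        # Return the files containing every word of the query.
        sets = [self.postings[w] for w in query.lower().split()]
        return set.intersection(*sets) if sets else set()

index = FullTextIndex()
index.add("handbook.pdf", b"Safety procedures for new employees")
index.add("menu.doc", b"Cafeteria menu for new year")
print(index.search("new employees"))  # {'handbook.pdf'}
```

A production engine like Xapian adds stemming, ranking and on-disk storage on top of this same basic structure.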

A new theme

This is the theme we currently use. We have tried to minimize clutter and optimize the experience for the end user. You can find the new theme under Admin / Themes / eFront2013.

A wealth of new functionality through modules

We’ve added new modules that many of you will find interesting:

  • “Idle users” module, to see at a glance which users haven’t been online for a while
  • “Course reports”, “Content reports”, “Branch reports”, to get a comprehensive list of useful data in a handy manner (Edu/Ent editions)
  • “Export unit”, to export a content unit to HTML with a single click
  • “Info-kiosk module”, to provide a single point of downloadable material for your users
  • “Outlook invitations”, that sends out calendar invitations for course schedules, compatible with Microsoft Outlook’s iCal (Edu/Ent editions)

Other improvements

  • “Empty spaces” questions now accept number ranges. For example, you can set the range 1-10 as an acceptable answer; any number the end user enters within that range is considered correct.
  • Custom user profile fields can now be ordered according to the administrator’s preference
  • Limit access to a course based on access count (e.g., access a course up to 3 times)
  • Several improvements to the separation between Branches
  • Download a certificate directly upon course completion
  • Several SCORM improvements
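The number-range matching for “Empty spaces” questions boils down to parsing the configured range and testing the learner's answer against it. A minimal sketch of the idea, assuming a "low-high" range syntax like the "1-10" example above (the function and its exact behavior are illustrative, not eFront's implementation):

```python
def answer_in_range(configured, answer):
    """Accept an answer if it falls inside a configured range like '1-10',
    or matches exactly when no range is configured."""
    if "-" in configured:
        low, high = (float(x) for x in configured.split("-", 1))
        try:
            value = float(answer)
        except ValueError:
            return False  # non-numeric answer can't match a number range
        return low <= value <= high
    return answer.strip() == configured.strip()

print(answer_in_range("1-10", "7"))       # True
print(answer_in_range("1-10", "12"))      # False
print(answer_in_range("Paris", "Paris"))  # True
```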

For a comprehensive list of improvements on this version check the Changelog at: http://docs.efrontlearning.net/Changelog

TinCan Demystified

If you are at all interested in eLearning, and unless you have been living on a deserted island for the last year, you have probably already heard about the TinCan project. TinCan is heavily promoted as the successor to SCORM and was designed to fix many things that were lacking in the previous standard. In this post we discuss what TinCan really is and how it compares to SCORM.

The Tin Can API resulted from several years of deliberation on SCORM 2.0. The standard is developed by the company Rustici Software, but ADL is still the steward of the specification, just as with SCORM. The Tin Can API is community-driven and free to implement.

At its core, TinCan is a messaging system. You collect messages, in the form of JSON statements, about what your learners are achieving while learning, playing, or interacting with other people. Those statements are stored in what TinCan calls an LRS (Learning Record Store). The LRS can be either standalone or part of an LMS. The standard doesn’t touch on how you go about translating those messages into something useful. In its simplest form, you simply present the statements as “noun, verb, object” – “I did this”. It is totally up to the LRS developers to make use of this data in any other way they see fit.
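A minimal “actor, verb, object” statement looks like the JSON below. The structure follows the published Tin Can examples (actor, verb, object), while the specific names, email and course URL are purely illustrative:

```python
import json

# "Maria Garcia completed Safety 101" as a Tin Can statement.
statement = {
    "actor": {
        "name": "Maria Garcia",
        "mbox": "mailto:maria@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.com/courses/safety-101",
        "definition": {"name": {"en-US": "Safety 101"}},
    },
}

# Serialized, this is the kind of message a client would POST to the LRS.
print(json.dumps(statement, indent=2))
```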

Compared to TinCan, SCORM was a very complex standard. It took our team around 8 months to build a SCORM 1.2 engine and more than 16 months for SCORM 2004 4th Edition. By contrast, we spent only 1 month completing a basic TinCan implementation for use with eFront and TalentLMS. Perceived simplicity is a core ingredient of the new offering and a major adoption point for LMS and authoring tool developers.

A nice side-effect of the messaging system is that any enabled device or program can send Tin Can API statements (mobile phones, simulations, games, real-world activities, etc.). By contrast, SCORM was browser- and LMS-based only. As the TinCan project puts it: “People learn from interactions with other people, content, and beyond. These actions can happen anywhere and signal an event where learning could occur. All of these can be recorded with the Tin Can API.” This openness is very important and, in our view, the biggest benefit that TinCan brings to the world.

TinCan also claims improvements on another commonly required but rarely delivered piece of functionality – the ability to complete learning objects offline and synchronize when you get back online. Even when not working completely offline, people ask for better support for browser timeouts and connection drops. SCORM depends on the browser session, so such issues are common and catastrophic.

In reality, the new API offers little direct help on this front. However, since the communication happens through simple messaging, client programs can easily store the messages while offline and communicate them to the LRS whenever the user returns online. However basic it may seem, this is an efficient solution. Never underestimate the power of simplicity!
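The offline trick described above can be sketched as a store-and-forward queue: statements accumulate locally while the client is offline and are flushed to the LRS once connectivity returns. This is a generic sketch, with a `send` callback standing in for the real HTTP POST to the LRS:

```python
class StatementQueue:
    """Store Tin Can statements while offline; flush them when back online."""

    def __init__(self, send):
        self.send = send      # callback that delivers one statement to the LRS
        self.pending = []

    def record(self, statement, online):
        if online:
            self.send(statement)      # deliver immediately
        else:
            self.pending.append(statement)  # keep for later

    def flush(self):
        # Called when connectivity returns: deliver everything queued, in order.
        while self.pending:
            self.send(self.pending.pop(0))

delivered = []
queue = StatementQueue(delivered.append)
queue.record({"verb": "attempted"}, online=False)
queue.record({"verb": "completed"}, online=False)
queue.flush()
print(len(delivered))  # 2
```

Because statements are self-contained JSON messages, nothing about them changes when they are delivered late; the LRS simply receives them in order.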

TinCan says very little about a few prominent SCORM elements, such as Packaging. The reason is that you might not need Packaging at all: your learning object might be a mobile application or a game that does not run inside an LMS, and Packaging has no value in such a loosely coupled environment. If you choose to import a TinCan package into your LMS, though, then yes, you will need to deal with content packaging, launch and import issues.[i]

TinCan also has little to do with the complexities of things like Sequencing. Do you remember what SCORM 2004 sequencing was? Let me refresh your memory…

“In SCORM 2004, the sequencing is completely dynamic; the sequencing implementation identifies the next activity based on both Tracking Model and Sequencing Definition Model of activities. In fact, the values of Tracking Model are dynamic but the values of Sequencing Definition Model are static. Actually, in SCORM 2004, the sequencing implementation collects the result of learner interactions with SCO (through CMI data model) and maps them to the Tracking Model and then evaluates the sequencing rules (defined for activities) based on the Tracking Model.”[ii]

This sort of complexity led to very low SCORM 2004 adoption. In our experience, over 90% of SCORM content is still SCORM 1.2. Perceived simplicity is the reason: people just want to grab the raw score. The other things SCORM 2004 offers are often surplus to requirements – people often ask for SCORM 2004 support but rarely use it.

Our biggest complaint with SCORM is that it is a reference model and not truly a standard; you don’t plug it into a wall and have everything work the same way. There is still too much variation in how compliant SCORM LMSs implement the UI associated with the SCORM RTE. Will content be loaded in a new window? A frameset? How large a window? How will the table of contents be presented? What navigation request does closing the browser imply? Content authors should be able to rely on a consistent set of UI expectations.

Unfortunately, TinCan does not help with this standardization. On the contrary, it leaves even more freedom to content creators, for example by letting them define their own verbs for statements. Interoperability of content between LMSs may improve somewhat thanks to the simpler messaging system and the absence of JavaScript; however, standardization of presentation or reporting will not benefit directly from TinCan.

To summarize, TinCan brings many good things, like simplicity and freedom from the browser and the LMS. On the other hand, it falls short on standardization of UI and reporting. In essence, TinCan tries to bridge elements of formal learning (mainly reporting) with informal activities (e.g., browsing or playing games). We foresee additional tools or sub-standards on top of TinCan to address real-world issues, especially around reporting and standardizing the verbs used in statements.


[i] http://scorm.com/wp-content/assets/tincandocs/Incorporating-a-Tin-Can-LRS-into-an-LMS.pdf

[ii] http://stackoverflow.com/questions/12080589/with-a-scorm-2004-lms-and-or-scorm-2004-scos-can-a-teacher-change-the-sequenci