
Tuesday, 27 May 2008

Presentation on VREs/MREs

Thanks to Rachel for pointing this out: an interesting presentation from the Eduserv Symposium by David Harrison of Cardiff Uni...
http://www.eduserv.org.uk/foundation/symposium/2008/presentations/davidharrison

Thursday, 22 May 2008

IT delivering value

Another interesting article on delivering value in Computing (15 May 08) - "Harnessing IT value", reporting on research at Cranfield which came up with six competencies that organisations should have if they want to deliver value through IT:
  1. creating strategy
  2. defining the information system contribution: translating strategy into processes, information and systems investments
  3. exploiting information: to maximise benefits
  4. defining the capability: long-term planning of architecture and infrastructure
  5. implementing solutions
  6. delivering IT supply

Monday, 7 January 2008

"webtop" vs desktop

Just came across this story http://knowledge.wharton.upenn.edu/article.cfm?articleid=1832 which follows on from previous posts on 4 Dec and 11 Oct.

Thursday, 3 January 2008

Cloud computing

Over the Christmas period, Bill St Arnaud posted a couple of interesting items on cloud computing. He also points to an excellent presentation by Savas Parastatidis. I'm not going to explain the background as the links here will do that much better than I can...

Advocates explain that cloud computing offers advantages over grid computing - clouds are potentially more powerful and crash-proof. And there is the idea that outsourcing the infrastructure can save institutions money and drive the environmental agenda.

The downsides associated with cloud computing include: immature standards (though this seems to be changing); inadequate access to high speed connections; data protection concerns.

The big players are all involved in cloud developments - Microsoft, IBM, Yahoo, Google, Amazon. Google is starting to work with a handful of US universities - University of Washington, Berkeley, Stanford, MIT, Carnegie Mellon, University of Maryland - with a view to expanding later to work with more, globally. Amazon is already offering its Simple Storage Service (S3) and is developing its Elastic Compute Cloud (EC2).
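For a flavour of what Amazon's storage service looks like to a programmer, here is a minimal sketch using boto3, the (much later) Python SDK for AWS - the bucket and key names are made up for illustration:

```python
# Minimal sketch of storing and retrieving an object in Amazon S3.
# Uses the modern boto3 SDK (which post-dates this post) and assumes
# AWS credentials are already configured; bucket/key names are
# hypothetical.
import boto3

s3 = boto3.client("s3")

# Upload a small file to a bucket.
s3.put_object(Bucket="example-research-data",
              Key="results/run1.csv",
              Body=b"sample,value\n1,0.42\n")

# Fetch it back.
obj = s3.get_object(Bucket="example-research-data", Key="results/run1.csv")
print(obj["Body"].read().decode())
```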

The Wikipedia entry for cloud computing also points to a couple of interesting articles.

Tuesday, 4 December 2007

Email: outsource or in-house?

Thanks to ResourceShelf for pointing out this story:
When E-Mail Is Outsourced
http://insidehighered.com/news/2007/11/27/email
This looks at some of the issues institutions are now facing (although it is US-focused) in deciding how to move ahead with email and other services. Microsoft and Google have both marketed to the higher education sector and offer the benefits of integration. But the choice facing institutions is not simple and raises a number of questions relating to:
  • cost/value
  • role of IT services
  • privacy and ownership of data
  • advertising
  • the value of an ".edu" or ".ac.uk" email address
  • capacity to innovate
  • support required
  • ability to influence priorities for development.
This relates to a story on BBC News (http://news.bbc.co.uk/1/hi/education/6741797.stm) earlier this year and to the move towards "software as services" mentioned in my posting on 11th October (http://ali-stuff.blogspot.com/2007/10/software-on-demand.html).

Tuesday, 27 November 2007

Geospatial Knowledge Infrastructures Workshop

I managed to catch some of the Geospatial Knowledge Infrastructures Workshop today, part of the eSI programme and hosted by the Welsh eScience Centre. Here are my quick notes...


Rob Lemmens from the International Institute for Geo-Information Science and Earth Observation talked about end-user tools. He contrasted the centralised approach of corporate/national Spatial Data Infrastructures (SDIs) with the community-driven approach of Web 2.0: SDIs are based on stricter rules for annotation, and their accuracy tends to be higher than that of Web 2.0 tools, although this is changing. Rob outlined the need for a semantic interoperability framework (a combination of ontologies, their relationships, and methods for ontology-based description of information sources - data sets, services etc.) and a semantic interoperability infrastructure (the framework plus the tools to maintain and use it, as well as the information sources produced within it). His presentation also included a good slide on the characteristics of an ontology, and a demonstration of ontology visualisation (the same tool ASSERT is using for clustering?). Rob concluded by summarising what the geospatial community can learn and take from Web 2.0 - for example tagging/tag clouds, tools for building ontologies (community tagging, e.g. Google Image Labeler) and instant feedback (e.g. password strength bars when selecting a new password); on the negative side, community-driven tagging can lead to weak semantics. Rob suggests combining the best of both the SDI and Web 2.0 worlds - map the SDI and Web 2.0 ontologies to create dynamic annotations of geo sources, thus improving discovery.
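To make that mapping idea concrete, here is a toy Python sketch of annotating a community-tagged geo source with SDI ontology concepts - the tags, concept names and the hand-built alignment table are all invented for illustration:

```python
# Toy sketch of the idea above: map community (Web 2.0) tags onto
# formal SDI ontology concepts so that loosely tagged geo sources can
# be discovered through the stricter SDI vocabulary. All names are
# hypothetical.

# Folksonomy tag -> SDI ontology concept (hand-curated alignment).
TAG_TO_CONCEPT = {
    "river": "sdi:Watercourse",
    "stream": "sdi:Watercourse",
    "lake": "sdi:StandingWater",
    "road": "sdi:TransportLink",
}

def annotate(source_tags):
    """Derive SDI annotations for a community-tagged geo source."""
    return {TAG_TO_CONCEPT[t] for t in source_tags if t in TAG_TO_CONCEPT}

# A Web 2.0 source tagged informally...
print(annotate({"river", "fishing", "wales"}))  # -> {'sdi:Watercourse'}
```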



Ulrich Bugel from the Fraunhofer Institut IITB presented on ontology-based discovery and annotation of resources in geospatial applications. Ulrich talked about the ORCHESTRA project (http://www.eu-orchestra.org/), which aims to design and implement an open service-oriented architecture to improve interoperability in a risk management setting (e.g. how big is the risk of a forest fire in a certain region of the Pyrenees in a given season?). This question has spatial references (cross-border, cross-administration), temporal references (time series and prognostics), a thematic reference (forest fire) and a conceptual reference (what is risk?). ORCHESTRA will build a service network to address these sorts of questions. Interoperability is discussed on three levels: syntactic (encodings), structural (schemas, interfaces) and semantic (meaning). The project has produced the Reference Model for the ORCHESTRA Architecture (RM-OA), drawing on standards from OGC, OASIS, W3C, ISO 191xx and ISO RM-ODP; the Reference Model went through many iterations, leading to Best Practice status at OGC. The ORCHESTRA Architecture comprises a number of semantic services: an Annotation Service, which automatically generates meta-information from sources and relates it to elements of an ontology; an Ontology Access Service, enabling high-level access and queries to ontologies; a Knowledge Base Service; and a Semantic Catalogue Service.
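As a rough illustration of what an annotation service of this kind does - this is not the ORCHESTRA API; the labels, concepts and triple format are invented - a minimal Python sketch:

```python
# Rough sketch of an annotation service: scan a source's text for terms
# that match ontology labels and emit meta-information linking the
# source to those concepts. Illustration only, not the ORCHESTRA API;
# all names are hypothetical.

ONTOLOGY_LABELS = {
    "forest fire": "risk:ForestFire",
    "pyrenees": "geo:Pyrenees",
    "risk": "risk:Risk",
}

def annotate_source(source_id, text):
    """Return (source, property, concept) triples for matched labels."""
    text = text.lower()
    return [(source_id, "annotatedWith", concept)
            for label, concept in ONTOLOGY_LABELS.items()
            if label in text]

for triple in annotate_source("doc42",
        "Seasonal forest fire risk in the Pyrenees"):
    print(triple)
```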



Ian Holt from Ordnance Survey presented on geospatial semantics research at OS. Unsurprisingly, OS has one of the largest geospatial databases, with 400 million features and over 2000 concepts. The benefits of semantics research: quality control and better classification; semantic web enablement; semi-automated data integration; data and product repurposing; and data mining - i.e. benefits to OS and to customers. OS has developed a topographic domain ontology which provides a framework for specifying content (www.ordnancesurvey.co.uk/ontology). It has developed ontologies for hydrology, administrative geography, and buildings and places; it is working on addresses, settlements and land forms, with supporting modules on mereology, spatial relations and network topology. A distinction was drawn between the conceptual ontology (knowledge represented in a form understandable by people) and the computational ontology (knowledge represented in a form understandable by computers). A controlled natural language called Rabbit has been developed - structured English, compilable to OWL - and OS is also part of the OWL 1.1 task force to develop a controlled natural language syntax. A project with Leeds University is developing a plug-in for Protege which allows natural language descriptions and, in the back end, translates them into an OWL model; the first release is scheduled for December, with a further release planned for March 08. Ian also talked about experimental work to semantically describe gazetteers - an RDF version (downloadable?) to represent the data and an OWL ontology to describe the concepts. This work includes administrative regions, with work underway to include cities etc. Through this work, OS has experienced some problems with RDF - e.g. it may degrade performance (they have >10 billion triples), and how much of it is really needed? Ian described some work on semantic data integration, e.g. "find all addresses with a taxable value over £500,000 in Southampton", looking at how to merge ontologies (i.e. creating another ontology rather than interoperability between the two). Lessons learned: ontologies are never perfect and can't offer complete descriptions of any domain; automatic tools are used as far as possible. Ian also described work on linking ontologies to databases using D2RQ, which maps SPARQL queries to SQL, creating "virtual" RDF. Conclusions: domain experts need to be at the centre of the process; technology transfer is difficult - the benefits of semantics in products and applications must be clarified.
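Ian's integration example is essentially a SPARQL query. Here is a minimal sketch using Python's rdflib over a toy in-memory graph - in the D2RQ setup he described, the same query would instead be rewritten to SQL against the relational store. The namespace, properties and data are invented:

```python
# Sketch of the "addresses over £500,000 in Southampton" query as
# SPARQL, run with rdflib over an invented in-memory graph.
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/address#")
g = Graph()
g.add((EX.addr1, EX.town, Literal("Southampton")))
g.add((EX.addr1, EX.taxableValue, Literal(650000)))
g.add((EX.addr2, EX.town, Literal("Southampton")))
g.add((EX.addr2, EX.taxableValue, Literal(320000)))

query = """
PREFIX ex: <http://example.org/address#>
SELECT ?addr WHERE {
    ?addr ex:town "Southampton" ;
          ex:taxableValue ?v .
    FILTER(?v > 500000)
}
"""
for row in g.query(query):
    print(row.addr)  # -> http://example.org/address#addr1
```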


Alun Preece from Cardiff University presented on an ontology-based approach to assigning sensors to tasks. The idea is to bridge the gap between people out in the field needing to make decisions (e.g. disaster management) and the data/information produced from networks of sensors and other sources. Issues tackled: data orchestration (determine, locate and characterise the resources required); reactive source deployment (repurpose, move and redeploy resources); and push/pull data delivery. The approach is ontology-centric and involves semantic matchmaking. Proof-of-concept work includes SAM (Sensor Assignment for Missions), a software prototype, and integration with a sensor network. The work is US/UK funded to support military applications - intelligence, surveillance and reconnaissance (ISR) requirements. It uses ontologies to specify the ISR requirements of a mission (e.g. night surveillance, intruder detection) and the ISR capabilities provided by different asset types, then uses semantic reasoning to compare mission requirements with capabilities and decide whether the requirements are satisfied. For example, if a mission requires Unmanned Aerial Vehicles (UAVs), the ontology would specify different types of UAV and the requirements of the mission (e.g. high altitude to fly above weather, endurance), and the semantic matchmaking (exact, subsuming, overlapping, disjoint) then leads to a preferred choice. The project has engaged with domain experts to get the information into the ontology and to share conceptualisations. Alun showed the Mission and Means Framework Ontology, a high-level ontology which is fleshed out with more specific concepts.
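To make the matchmaking step concrete, here is a deliberately simplified Python sketch of the four degrees Alun listed, reduced to set comparisons over capability concepts - a caricature of the real description-logic reasoning, with invented concept names:

```python
# Illustrative sketch of the four matchmaking degrees (exact, subsuming,
# overlapping, disjoint), reduced here to set comparisons between the
# concepts a mission requires and the concepts an asset provides.
# The simplification and concept names are mine, not SAM's.

def match_degree(required, provided):
    required, provided = set(required), set(provided)
    if provided == required:
        return "exact"        # asset provides precisely what is needed
    if required <= provided:
        return "subsuming"    # asset covers all requirements, and more
    if required & provided:
        return "overlapping"  # partial coverage only
    return "disjoint"         # no useful capability

mission = {"NightSurveillance", "HighAltitude", "LongEndurance"}
uav     = {"NightSurveillance", "HighAltitude", "LongEndurance", "Relay"}
print(match_degree(mission, uav))  # -> subsuming
```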

Slides from the workshop will be uploaded to http://www.nesc.ac.uk/action/esi/contribution.cfm?Title=832

Wednesday, 21 November 2007

Breaking down digital barriers - report and case studies

Thanks to docuticker (http://www.docuticker.com/) for pointing out a report on Breaking Down Digital Barriers (http://www.docuticker.com/?p=17844). One of the three case studies looks at interoperability issues with mashups:
"Most clearly among our three case studies, the area of Web services demonstrates the manner in which interoperability can stimulate large-scale innovation."

Friday, 16 November 2007

Skills required to support SOA and integrated services

From Computing 15 November 2007, article "Fitting the skills together" by Jim Mortleman:
"..analyst Gartner predicts that four out of five companies will have taken the SOA route by 2010...SOA involves a fundamental change to the way firms think about IT - namely, as a series of interoperable business services, rather than as discrete IT systems."

The article also quotes Nick Masterton-Jones, IT Director of VocaLink: "I think SCA is something we're going to see a lot more of in the coming three years." SCA is Service Component Architecture, "an open SOA promoted by major Java vendors to bridge the gap between people who understand the business domain and people who understand system design".
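For a flavour of the mindset shift the article describes - callers depend on a business service contract rather than on a discrete system - here is a loose Python sketch. Real SCA wiring is declarative and Java-centric; the service and class names here are invented:

```python
# Loose sketch of SOA-style thinking: the business capability is an
# interface, and components are wired to an implementation rather than
# to a particular system. Names are hypothetical.
from abc import ABC, abstractmethod

class PaymentService(ABC):
    """Business-level contract, independent of any one IT system."""
    @abstractmethod
    def pay(self, account: str, amount: float) -> str: ...

class LegacyPaymentSystem(PaymentService):
    def pay(self, account, amount):
        return f"paid {amount} from {account} via legacy batch run"

class OrderComponent:
    # Depends on the service contract, not on the underlying system.
    def __init__(self, payments: PaymentService):
        self.payments = payments

    def checkout(self, account, total):
        return self.payments.pay(account, total)

# "Wiring" the composite: swap implementations without touching callers.
orders = OrderComponent(LegacyPaymentSystem())
print(orders.checkout("ACC-123", 99.50))
```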

Monday, 29 October 2007

Data management - learning from commercial sector?

Computing (25 Oct 07) has an article on data management, featuring BAE Systems as one of a series of case studies. BAE estimates that "80% of networked employees were wasting an average of 30 minutes a day retrieving information, while 60% were spending an hour or more duplicating the work of others".

The article acknowledges the cultural barriers to using/sharing data and suggests policies are put in place to establish guidelines and principles, as well as training and mentoring to help develop the collaborative and information management skills required.

One of the case studies, Denton Wilde Sapte, cautions "People are so wrapped up in the technical whizz-bangs that they forget that IT is really all about information delivery".

"Organisations are recognising that some pieces of their information have more fundamental value than other parts, although that value might not be realisable today. For certain items of information its maximum value will only be achieved at some point in the future, so companies need to invest in good archiving, storage, search and retrieval systems today" Ian Charlesworth, Ovum, quoted in the article.

Friday, 12 October 2007

Gartner's Top 10 strategic technologies for 2008

Thanks to Bill St Arnaud for pointing to this on his blog:
At the Gartner Expo this week, the following were discussed as the top 10 technologies organisations can't afford to ignore...

  1. Green IT
  2. Unified communications (interesting for VRE programme)
  3. Business Process Management (to support SOA)
  4. Metadata management
  5. Virtualisation 2.0
  6. Mashups and composite applications
  7. Web platform and Web-Oriented Architecture
  8. Computing fabrics
  9. Real World Web
  10. Social software

Thursday, 11 October 2007

Software on demand

Computing (11 October 2007) this week features a story by Tom Young, "Online software is in demand", about recently launched products that "are hosted and accessed in real time rather than being installed on in-house systems". Adobe, IBM, Google and Yahoo are all either developing or releasing products in an attempt to compete with Microsoft's dominance. The software-on-demand model offers a number of benefits around updates, licensing, virus protection and flexibility.

Sunday, 7 October 2007

SEASR

Thanks to Frederique for pointing to this - Chris Mackie mentioned it in a meeting earlier in the year but I hadn't followed it up since...

http://www.seasr.org/

From their website:

"SEASR (Software Environment for the Advancement of Scholarly Research) is being developed by the National Center for Supercomputing Applications in cooperation with the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign.


SEASR aims to:

  • assist scholars in accessing and analyzing existing large information sources more readily and with greater refinement;
  • give scholars increased portability of large information stores for on-demand computing; and
  • empower collaboration among researchers by enhancing and innovating scholarly communities’ and their resources’ virtual research environments.

How will we do it? The SEASR development team will construct software bridges to move information from the unstructured and semi-structured data world to the structured data world by leveraging two well-known research and development frameworks: NCSA’s Data-To-Knowledge (D2K) and IBM’s Unstructured Information Management Architecture (UIMA). SEASR will focus on developing, integrating, deploying, and sustaining a set of reusable and expandable software components and a supporting framework, benefiting a broad set of data-mining applications for scholars in the humanities.

SEASR’s technical goals include supporting:

  • the development of a state-of-the-art software environment for unstructured data management and analysis of digital libraries, repositories and archives, as well as educational platforms; and
  • the continued development, expansion, and maintenance of end-to-end software system: user interfaces, workflow engines, data management, analysis and visualization tools, collaborative tools, and other software integrated into a complete environment."
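As a rough picture of the "reusable and expandable components" idea in that description, here is a toy Python pipeline - this is in no way the actual D2K or UIMA API; the components and document format are invented:

```python
# Toy pipeline in the spirit of the SEASR description: reusable
# components that carry unstructured text toward structured data.
# An illustration of the architecture only, not the D2K/UIMA APIs.
import re

def tokenize(doc):
    doc["tokens"] = re.findall(r"\w+", doc["text"].lower())
    return doc

def extract_years(doc):
    # Pull out four-digit years (1800-1999) as a tiny "structured" field.
    doc["years"] = [t for t in doc["tokens"] if re.fullmatch(r"1[89]\d\d", t)]
    return doc

PIPELINE = [tokenize, extract_years]  # an expandable set of components

def run(text):
    doc = {"text": text}
    for component in PIPELINE:
        doc = component(doc)
    return doc

print(run("The society was founded in 1873 and reformed in 1901.")["years"])
```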

Thursday, 4 October 2007

UIMA

This week's Computing (4 Oct) mentions 5 information management technologies to watch out for in the next 3 years: