Analysing Collaboration, But Not As We Know It

Yesterday I went to a presentation about Analysing Collaborative Processes and Interaction Patterns in Online Discussions from researchers at the OU.

I found myself getting quite fired up, not in a good way, about their early work, which looked at how 12 students had worked on a collaborative task, generating 29 messages (this was 2001, folks). They went on to categorise the messages (by hand) like this...

  • Joint knowledge building
  • Asking questions, dialogue extension prompts
  • Supporting with reference or example
  • Acknowledging/ replying / referring to another message
  • Motivation and commitment to task
  • Instructions/information - coordination messages
... and then diagrams were drawn. I then found myself getting all worked up, not in a good way, about the diagrams, in which (for me) too much liberty had been taken with the spatial layout of the data, robbing it of potential meaning. For example, orphan messages were collected at the side, when maybe they should have been clustered (is loneliness a shared thing?).

I heard about some interesting projects that look to make "sense" of online discussions. For example, AcademicTalk restricts the opening sentence of every message in a forum, which effectively categorises each message in a certain way.

See also: Digital Dialogue.

I really like the idea of "constrained conversation" ... almost like a parlour game that forces you into somehow being more communicative. This reminded me of work by Simon Buckingham Shum on argumentation where you sort of construct a discussion from visual building blocks that I saw in the last millennium - you know, it feels like more than a thousand years ago sometimes.

But I found myself getting really worked up, and not in a good way, about the very idea of analysing discussion anyway. If you look at the numbers, the crudest form of measurement, you get crude results. If you choose to tag the discussions, your perspective skews everything... and if you change the environment to something better (than email) then you've changed so much that measurement is pretty pointless anyway. Yes, you might be better able to understand the collaborative processes that are happening, but they are happening in such an artificial environment that your findings are meaningless.

And anyway, how do you even define collaboration?... One person's successful collaborative experience might leave the other participants feeling exploited. Crowdsourcing, anyone?

I was also reminded of Jer Thorp's visualization work for the New York Times on the life-cycle of a tweet. Here, the crudest measures... tweets and re-tweets... are shown in realtime in an infinite animated 3D space. It's the sort of thing we maybe all should have instead of an email in-tray... a collection of funky diagrams that we keep an eye on, jumping in when the ripples get too wobbly or when a diagram "goes quiet". (Excuse me whilst I get worked up in a good way.)

There for me is the chicken and the egg. If you "hand categorise" your discussions, any worth is either tainted by the viewer (pretty much like quantum physics) or completely irrelevant, in that the findings can't really be applied elsewhere.

And if you work with the crude numbers, unless you have a LOT of data you don't have patterns that might be spotted automatically. Imagine a conversation bot popping up mid flame war and saying "I notice that this discussion thread seems to be losing its focus and becoming all about petty point-scoring, please desist!". I could imagine that making all the difference to humanity's ability to discuss things rationally.
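Just to show how crude "crude numbers" really are, here's a toy sketch of what such a bot's trigger might look like. Everything here is invented for illustration: the heuristic (two people trading rapid one-liners probably means a flame war) is mine, not anything the researchers proposed.

```python
from collections import Counter

def flame_war_score(messages, window=10):
    """Crude heuristic: a thread may be derailing when the last few
    messages are short, rapid exchanges between the same two people.
    `messages` is a list of (author, text) tuples, oldest first."""
    recent = messages[-window:]
    if len(recent) < window:
        return 0.0
    authors = Counter(author for author, _ in recent)
    # What fraction of recent traffic comes from just two people?
    top_two_share = sum(n for _, n in authors.most_common(2)) / len(recent)
    # What fraction of recent messages are one-liners (< 15 words)?
    short_share = sum(1 for _, text in recent if len(text.split()) < 15) / len(recent)
    return top_two_share * short_share  # 1.0 = two people trading one-liners

thread = [("alice", "you're wrong"), ("bob", "no YOU are")] * 5
print(flame_war_score(thread))  # → 1.0
```

A real system would obviously need far more signal than message length and author churn, which is exactly the point: with only crude numbers, this is about as good as the bot gets.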

All Of Which Leads Me To This...

An idea...

So, if you can't have any analysis that requires a researcher to add it, and you can't rely on the crude numbers what might you use instead?

Remember, I've already said that you also can't invent a fancy-dancey, bells-and-whistles, parlour-game-style environment to force people to behave differently so that you can now measure them properly.

You could just use email and forums. These forums and messages would have extra meta tools, though (which is only slightly fancy-dancey), that would enable the participants to tag discussions, particularly for negative, anti-collaborative indicators.

Imagine that in the flow of a discussion forum, you could select a certain sentence and from a pop-up mark it as "Self-aggrandising" or "Funny, but disrupting the flow of the discussion" or "Rude and disrespectful" or "Missing the point" or "Deluded". Now imagine that these scores were anonymous... but aggregations of them were shown on your profile.
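The mechanics of "anonymous scores, aggregated on your profile" could be sketched very simply. This is a hypothetical data model of my own devising (the tag names come from the post, everything else is assumption): tagger identity is recorded only to stop double-voting, and only counts ever surface.

```python
from collections import Counter, defaultdict

TAGS = {"self-aggrandising", "funny-but-disruptive",
        "rude-and-disrespectful", "missing-the-point", "deluded"}

# (message_id, tag) -> set of tagger ids; never displayed, only counted
votes = defaultdict(set)

def tag_message(message_id, tag, tagger):
    """Record one anonymous tag; a tagger can't vote the same tag twice."""
    if tag not in TAGS:
        raise ValueError(f"unknown tag: {tag}")
    votes[(message_id, tag)].add(tagger)

def profile_summary(author_message_ids):
    """Aggregate anonymous tag counts across one author's messages."""
    summary = Counter()
    for (mid, tag), taggers in votes.items():
        if mid in author_message_ids:
            summary[tag] += len(taggers)
    return summary

tag_message(1, "rude-and-disrespectful", "anon-7")
tag_message(1, "rude-and-disrespectful", "anon-9")
tag_message(2, "missing-the-point", "anon-7")
print(profile_summary({1, 2}))
# → Counter({'rude-and-disrespectful': 2, 'missing-the-point': 1})
```

The interesting design choice is that the set-based vote store gives you anonymity and double-vote protection for free, without ever storing who said what about whom in a displayable form.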

The point of this is that when things are good, most people don't see it. Good interaction design isn't even noticed; it disappears. Also, making negative, perhaps harshly judgmental comments that are attributed to you is quite a challenging thing to do... "passive aggressive"... "bullying!" etc. But it's often clearer why collaboration DOESN'T happen than it is to agree when it does.

I've always longed for an "Unlike" button in Facebook, not because I'm a snarky miserabilist, but just because I want to show disapproval without getting into long pointless arguments about it. And maybe the only person who gets to see this feedback is the person at whom it is directed... maybe only after it's gone beyond a certain threshold.
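That threshold idea is simple enough to write down. A minimal sketch, assuming an arbitrary cut-off of five and a viewer/target check that I've invented for illustration:

```python
THRESHOLD = 5  # arbitrary: feedback stays invisible until this many people agree

def visible_unlikes(unlike_count, viewer, target):
    """Only the person the feedback is aimed at ever sees it, and only
    once enough people have independently expressed the same disapproval.
    Returns the count to display, or None if nothing should be shown."""
    if viewer != target:
        return None  # hidden from everyone else, always
    if unlike_count < THRESHOLD:
        return None  # below threshold: not even the target sees it yet
    return unlike_count

# One lone "Unlike" stays private; a chorus of seven reaches its target.
assert visible_unlikes(1, "bob", "bob") is None
assert visible_unlikes(7, "alice", "bob") is None
assert visible_unlikes(7, "bob", "bob") == 7
```

The threshold is doing the social work here: a single anonymous "Unlike" is just one person's bad mood, but five independent ones start to look like signal.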

Technically, this would work just the same as regular emails and forums, albeit with a few extra markup tools. It might show us what mixtures of "types" of people produce good and bad collaborative experiences or it might show us who the Naysayers are in an organisation.

Maybe looking for what makes good collaboration based on what is there is a bit like understanding space... you need to look at what's not there, the black stuff, dark matter, to understand how it all works.


