Methods Fishbowl Discussion Notes

From Data-Intensive Collaboration in Science and Engineering
  • Project-level unit of analysis: dissertations on LTER, NEON, OSG, etc. -- recommended to move to the institutional level and apply the same sorts of methods to whole institutions
  • What are institutions? How do we analyze institutions and networks?
    • Imposing an institution on the site raises the concurrent issue of where its boundaries lie
  • Who are the individuals that are crossing projects and institutions?
  • Vygotsky’s unit of analysis is based on the purpose of the research: What are you trying to see and develop over time? Are you looking at the individual, the activity, or the institution changing over time? Select something that crosses many boundaries (for example, Activity Theory’s internal/external aspects of an activity)
    • Social network analysis can scale, but actor-network theory is difficult to scale
      • Nardi & Monge applied ANT to social networks via multi-mode network analysis
      • The “Many Nets” application allows for following multiple actors over time
    • Bonnie Nardi’s 6 criteria for investigating activities: ask for concrete examples of how things work, similar to the Critical Incident Methodology
    • Engestrom’s activity systems analysis: “change laboratories,” a process of developing the analysis in collaboration with the group you are studying - what is meaningful to the people? This changes the role of the academic to one who hopes to change the activity
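The multi-mode network idea above (following both people and non-human actors, in the spirit of Nardi & Monge) can be sketched with a small two-mode edge list. This is a minimal stdlib-only Python illustration, not any specific tool mentioned here; all names and ties are invented.

```python
from collections import defaultdict

# Hypothetical two-mode (person, artifact) edge list: who touched which artifact.
edges = [
    ("alice", "dataset_1"), ("alice", "paper_A"),
    ("bob", "dataset_1"), ("bob", "instrument_X"),
    ("carol", "paper_A"), ("carol", "dataset_1"),
]

# Degree of each node, regardless of mode: a first, scalable summary statistic.
degree = defaultdict(int)
for person, artifact in edges:
    degree[person] += 1
    degree[artifact] += 1

# Project the two-mode network onto people: two people are tied if they
# share at least one artifact (a standard bipartite projection).
by_artifact = defaultdict(set)
for person, artifact in edges:
    by_artifact[artifact].add(person)

person_ties = set()
for people in by_artifact.values():
    for a in sorted(people):
        for b in sorted(people):
            if a < b:
                person_ties.add((a, b))

print(degree["dataset_1"])    # 3: the most-shared artifact
print(sorted(person_ties))    # [('alice', 'bob'), ('alice', 'carol'), ('bob', 'carol')]
```

The projection step is where this kind of analysis scales: artifact-mediated ties among thousands of actors can be computed mechanically, whereas a pure ANT account requires following each actor by hand.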
  • There is no cure-all unit of analysis: it should be context-specific to our research questions
  • The research question is one route to determining the unit of analysis, but what about potential impact?
    • Ex: Open Data Grid and the complexity of collaboration - scaled to the production of scientific papers; rather than “What software did you use to complete this paper?” the question asked was “How did you get to this paper?” - great for interviews
    • Carlston’s work following documents through hospitals
  • How do we capture the temporal dimension of cyberinfrastructure?
  • Failure may be the most interesting focus for understanding value and the ecosystem: How do you get people talking about failures, or uncover failure?
    • Qualitative analysis of what didn’t make it to publication?
    • Instead of asking about failures directly, ask about processes or activities that may not have been recorded (tensions, difficulties, roadblocks, etc.)
    • Traces of error through bug tracking
    • Feed egos/soften them! Start with a discussion of their accomplishments; they will be more open about failures once they feel less threatened about their own validity as researchers
  • Grounded theory’s notion of the conditional matrix: Why? What happened? A policy, condition, or norm may have colored how a success or failure came about
  • Observation does not give us causation: so how can we attribute causation?
    • Is experimental work the only way to directly attribute causation?
    • There is room in cyberinfrastructure to be novel with the combination of method to capture complex sociotechnical phenomena
    • Human/the social scientist's pattern recognition of dynamic systems
    • “You can do interesting case studies forever, but it won’t necessarily advance the field. You want to ensure that you can advance the field, though our field allows for both forms.”
  • Interventions:
    • Bottom-up design changes and pockets of opportunity for incremental improvement (How do we know improvement, how can we evaluate “effective” in cyberinfrastructure?)
    • BACK UP! Before we can do interventions we have to make ourselves (as social scientists) critical to the process!
      • EarthQ: Cogburn’s NSF EAGER awards (a combination of GEO, OCI, VOSS, and SOCCS): effective in counterbalancing the lack of recognition of the critical need for social science in hard science
      • DataOne w/ Dana and Matt Bietz
  • 4S Panel on taking the STS Canon Digital:
    • How has ethnography changed in the face of a highly technical laboratory?
  • Learning analytics (at HICSS):
    • The many online sources of data and prediction
    • Algorithms for understanding others’ understanding (how meta!)
    • Measuring through the success/failure of objectives
  • Dana’s “Extreme Ethnography”:
    • Ethnography with trace data and network analysis, natural language processing
    • Answering the “Why?” questions: trace data alone is not enough to answer these!
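The “Extreme Ethnography” combination above - trace data plus simple natural language processing - can be sketched in a few lines. This is a hypothetical stdlib-only illustration (all actors and log messages are invented): trace data answers the who/what/when, and naive keyword flagging surfaces candidate failure moments, which still require interviews to answer the “Why?”

```python
import re
from collections import Counter

# Hypothetical trace records (actor, free-text log message); all invented.
traces = [
    ("alice", "Re-ran the pipeline after the calibration failure"),
    ("bob", "Uploaded dataset; calibration looked off again"),
    ("alice", "Meeting notes: roadblock with the sensor firmware"),
]

# Activity counts per actor -- what trace data answers well (who/what/when).
activity = Counter(actor for actor, _ in traces)

# Naive NLP: flag words hinting at tension or failure. Trace data can locate
# these moments, but only interviews can answer *why* they happened.
tension_words = {"failure", "roadblock", "off"}
flags = Counter(
    word
    for _, msg in traces
    for word in re.findall(r"[a-z]+", msg.lower())
    if word in tension_words
)

print(activity["alice"])   # 2 traced actions for this actor
print(dict(flags))         # each tension word appears once in this toy log
```

In practice the keyword list would be replaced by a real NLP pipeline, but the division of labor is the same: automated traces narrow down where to look, ethnographic follow-up explains what was found.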