Working with Emerging Technologies

Wednesday, February 11, 2015 | 9:30AM–10:15AM | Laguna, Fourth Floor
Session Type: Professional Development

From Idea to Supported Product: A Process for Identifying and Introducing New Instructional Technologies
Pat Reid, Manager, Teaching and Learning Initiatives, Purdue University
Although our team is called Innovations in Technology and Learning, getting a new technology established on campus as a centrally supported, widely used tool proved difficult. We faced a steady flow of requests for new technology but had no clear way to review, compare, or recommend tools. With more than 30 technologies already supported and limited staffing, recommending yet another was a hard sell. At the same time, requests for analytics increasingly asked about tool usage and value. In response, we developed a process and a set of templates: a software feature comparison, a recommendation, an implementation plan, and a more structured faculty adoption plan.


OUTCOMES:

  • Appreciate the difficulty of getting new technologies adopted
  • Identify a procedure for walking a technology through support approval
  • Recognize the need for a decision-making model for product comparisons
  • Develop an implementation plan to improve faculty adoption of the technology


How Useful Is That Widget? Multisource Methods for Thorough Evaluation of Instructional Technology Products
Virginia W. Lacefield, Enterprise Architect, University of Kentucky
With promising new instructional technology products being released all the time, many educational institutions find themselves running frequent technology pilots, each of which requires an effective evaluation plan to assess the product's overall value for enhancing teaching and learning. In this session, I'll show how the University of Kentucky combined product usage data, qualitative and quantitative faculty and student feedback, and student grades to evaluate several technology products used in face-to-face and online contexts. I'll also review some of the strengths and limitations of our methodology and offer tips for planning robust assessments of your own.


OUTCOMES:

  • Learn about evaluation plans designed for different technology pilots
  • Understand how and when to combine "hard" usage analytics with "soft" user feedback data
  • Develop assessment strategies tailored to your own technology pilots' goals and contexts

Presenters

  • Jason Fish

    Director, Teaching and Learning Technologies, Purdue University
  • Virginia Lacefield

    Research Data Analyst, University of Kentucky
  • Pat Reid

    Dir., ID - Retired, University of Cincinnati