Monday, 19 December 2011

Mon 19th Dec 2011 - (3) - Meeting with Company B

Mon 19th Dec 2011
3 hours - meeting with Company B
**************************

Observation session with Company B: a usability study.

A non-technical person was asked to use the product to create a solution for their problem. By problem I mean making an existing piece of work more beneficial, faster, or automated.

The session was monitored by two technical persons, the Brand Manager and the Scrum Master, with me sitting in and observing.
Mouse clicks, facial expressions, and audio were recorded during this hands-on session.

This study was done in a guided-help format: the non-technical person used the product to solve their issue, but guidance was provided along the way.

Questions arose from the user around:
-language and definitions
-navigation of the product
-two parts that were not intuitive, based on how the user's mind conceptualised them, though a reasonable workaround was found

At the end the user was interested in learning more about the product, and was able to see that what they had designed could make work more efficient for a group of people.

The user saw the value of the solution created.

(not entirely sure if this was the case in the original study with no guided help)

A couple of bugs arose, but they were logged to be fixed.

Based on today's session I think they are on the right track with the testing study.
****

Next steps for them:
--------------------
-Iron out bugs, get ready for Early Adopters Programme ( Dec & Jan)
-Meeting for features design with UI expert to make changes (Jan 2012)
-Meeting with Early Adopters to see if the product meets their needs and possibly buy into the idea of the product. Beta test for 3 months with support (to start possibly in Jan or Feb 2012)


Hopefully I will be allowed to sit in on those meetings.

The objective of those meetings, for me, is to get an appreciation of the work process at Company B and to see what they are doing right or not so right, and what can be improved.

Thursday, 24 November 2011

Nov 24 - Project discussion with UOG person (3)

Guidance on the project; question-and-answer segment.

Very beneficial. I am now more focused on the project.

Nov 24 -System Integration Testing -SIT at GG (17 hours)

System Integration Test - SIT at GG: 1 week = 5 days.

Researched it for 17 hours over the course of the week.

Spent time working with them (people from other islands) and jotting down notes during their training/testing session.

PROCESS OF field research:


OUTCOME from it:

Nov 24 - Meeting at company B - (1)

 1 hour
Added notes to the Chapter outline.
Plan to go to the user meeting next week.

***** TIME CHECK - 57 hours done so far. *****

Monday, 7 November 2011

Nov 7th (3) - notes from Meeting with HF

Users to survey later on :


 Ann Marie Spring - Automatic Payment Allocation
Mya - Home Blanket Scheme


Carmen, Tasmin, Bibi - Renewal Reports to Excel
Lockey - Margaret, filing clerks  - Physical File Tracking System
Motor home - Mya



e.g.
Automatic payment allocation – simple – good.  This is the normal accepted flow of an enhancement request requiring a system modification:
·         FRNT4077 – first request
·         \\Pelican\Departments\GGL - Information Technology\System Requests - Tests\GGLS1392 - Automatic Payment Allocation Agos J&C RMSGGL – look for BIF word document for further investigation of request
·         GGLS1392 – System modification information
·         Verification tests in the above pelican folder
·         Sign off received from requester
Good example of a task that went well. 
The request was clear and straightforward, and followed on from another task, which made it simple.
The email request was clear and concise, with an example or reference given.
BIF was done - completed by the BA.

We do not measure the cost of, or quantify, our work - which is bad. Such measurement could be used to market IT, promote what we do, create a deeper appreciation for us, and reduce the mystery around IS.
The request got the green light to go ahead and was prioritized.
It was fixed, verified/tested, and the manual updated.
Promoted and signed off.
For certain areas/tasks users are brought in to test. There is the perception that the BAs are skilled enough to test thoroughly.
Observation: if the requirements part is messed up, then the whole thing becomes messed up. It is key to get the requirements correct first.
Developers need to design with the correct requirements. Proper analysis must be done at the beginning.
******

Home blanket upload – delays experienced
·         Initial request very bland via Manager
·         O:\GGL - Information Technology\Enhancement Requests\2010\Home Blanket Certificates - BIF completed to gather more details.  This was continuously being refined; see the timing between the first and last BIF – analysis was not thorough enough
·          Sign off received from requestor …. There should have been a proposal to follow up a part 2 – not formally followed up
It was not a front-line request, but a manager request to the head developer.
Proper requirements gathering was not done. Staff was short at that time, it was close to the end of the year, and there was no negotiating.
There was not a strong business analyst on hand to gather requirements.

Requirements were gathered from one person, who gave a detailed view of the task, and IT was under the perception that it was complete. Liaising with the other people who use it was not done; he was not the one who was directly interfaced with.
More people should have been consulted.

It was brand new functionality, replacing manual procedures.
Because the requirements were not complete, the requirements kept changing, taking longer.
There were costs in terms of time and money. They should perhaps have said up front what they wanted.

People signed off on it.

Need to improve on analysis and requirements gathering. How can we do this?
Since it was brand new functionality, it should have been treated as a mini project, taking into consideration all the phases: e.g. the launch date, what needed to be done, hardware and software requirements.
It needs a strong BA to sift through the stuff.

How do you make the most of a bad situation?
What do you do when you have red flags and how do you deal with reality when there are staff and time constraints?

Physical File tracking system:
WIA had a bandwidth issue. When the task was ready, the user was not available at the time. There was no one to market it in the other islands.
Once the bandwidth issue was resolved, people came on board and used it.

Investigation is needed into not just the task, but the feasibility of the solution in the given environment.

Perception: IS know what they want - hence there is no need to be in-depth and specific about requirements.
It depends on the people: for instance, you need the managers driving the change. In the other islands you need this kind of environment.
Trans Nemwil had that support when they transitioned to i90. He had his staff on board and pushing the change. The team made sacrifices on both sides, IS and Trans Nemwil, and it paid off. A smoother transition.

Guyana did not have the push from management, hence it took longer, and we are still doing support for them.

**** 
Renewal listing
The functionality did not change, but the workflow changed - this is what can turn out traumatising.
People were no longer waiting for hard copies of reports, but had to generate them themselves and use Excel.

The drive was for people - each department - to take ownership of it.

There was a lot of hand-holding, demos, etc.

Newer staff picked it up faster. Older staff needed hand-holding.

**************** 
Physical file tracking 

Live Meeting helps with long-distance communication, demos, hand-holding, and support. It can work to bridge the gap between the islands, and might lead to faster adoption.

We have to be careful not to force our own perception of how it should be on the users.
We should not dominate the requirements with our own needs.

For instance the way we use a system is not the way the users might be using the system. We have to understand the users. 
Renewal reports to excel – workflow changes were involved.
Physical filing management system  - the user who requested it was not ready when it was – similar to image processing
Motor home – the objective was unclear amongst all parties.

Market IT - to promote IT.

How do Scotia or RBC roll out to users, since they are not in the same country?



Nov 2nd - Literature Review (3)

3 hours:

Literature Review


How Habit Limits the Predictive Power of Intention: The Case of Information Systems Continuance - ACM

Nov 7th - (1.5) Meeting Helen Ford - on Task selection @ GG



From: Marina Maharaj
Sent: Friday, November 04, 2011 2:16 PM
To: Helen Ford
Subject: Objectives of today's meeting

Meeting with Helen Ford as at Fri 4th Nov 2011
Objectives
·         Find projects that are good  and bad cases to work with for investigating user acceptance
·         Select users to work with to set up interview/surveys

Projects with the Criteria:
Ø  There was room for improvement
Ø  Some things were not handled properly
Ø  People part of it that was problematic
Ø  Team support was it there
Ø  Managing expectations of managers and users
Ø  Users and Managers – different language/perspective
Ø  Where generally some conditions were just not right or conducive for success
Ø  Projects that works well, what were the conditions for it that allowed this success in user acceptance



From: Helen Ford
Sent: Friday, November 04, 2011 4:30 PM
To: Marina Maharaj
Subject: RE: Objectives of today's meeting

e.g.
Automatic payment allocation – simple – good.  This is the normal accepted flow of an enhancement request requiring a system modification:
·         FRNT4077 – first request
·         \\Pelican\Departments\GGL - Information Technology\System Requests - Tests\GGLS1392 - Automatic Payment Allocation Agos J&C RMSGGL – look for BIF word document for further investigation of request
·         GGLS1392 – System modification information
·         Verification tests in the above pelican folder
·         Sign off received from requestor

Home blanket upload – delays experienced
·         Initial request very bland via Manager
·         O:\GGL - Information Technology\Enhancement Requests\2010\Home Blanket Certificates - BIF completed to gather more details.  This was continuously being refined; see the timing between the first and last BIF – analysis was not thorough enough
·          Sign off received from requestor …. There should have been a proposal to follow up a part 2 – not formally followed up

Renewal reports to excel – workflow changes were involved.
Physical filing management system  - the user who requested it was not ready when it was – similar to image processing
Motor home – the objective was unclear amongst all parties.


Regards


Helen Ford
Team Leader – Information Systems

Wednesday, 2 November 2011

Oct 30 - Literature Review (4)

******** ******** ******** 
Oct 30 - Literature Review
4 hours
4 months Left
******** ******** ******** 



Measuring the influence of social abilities on acceptance
of an interface robot and a screen agent by elderly users - ACM





Consumer Mashups: End-User Perspectives and
Acceptance Model  ACM

Saturday, 29 October 2011

Sept 10 - Survey Work for GG

Sat 10th Sept 2011
//  *********************************************
//  Project Deadline                                : March 30th 2012
//
//  Generate Survey and select projects  :
//  *********************************************

Generate Survey
Select Projects Successful and less successful projects/enhancements/system requests

Oct 29 - Mon Tue Testing session at GG (16 hours)

**********************
Training/Testing session to introduce the users to ProfoundUI at Company A
 2 days of testing  16 hours
**********************

Training Testing sessions were done Mon and Tue for 2 full days.

Brought in users from different departments 5250 users and genie users.

Slow and fast processors were included. They got to process the transactions they do daily, as well as another area they might not be familiar with.

So they had to test all areas. This was done with the new ProfoundUI skin or interface.

They were introduced to it for the first few hours, then let go: they would test as if they were processing in the live environment.
Any issues found that stopped them were fixed almost immediately.

It was a one-on-one test session, i.e. the user used the system while the BA and programmers walked around and noted issues, positives and negatives.

Issues from the previous interface, Genie, were fixed (e.g. speed), so there were not the same setbacks as before.

Overall: positive feedback. 4 out of 6 users said they could use the system now as is and be fine.


Outline: the BA tested and the programmers got all major issues out of the way ASAP before user testing on these 2 days; then the users tested, gave feedback, and bugs were collated to be fixed again.

Because most of the issues were already out of the way it was not traumatizing to the user and they could process throughout without stopping.

User concerns and recommendations were taken into consideration, so they had a voice; they were empowered to make comments and be listened to.

A cross-section of users tested, so it was a good test group to represent the masses.

Recommendations were minor; no major changes. Overall they were satisfied.

The next training session will be when all the islands come in for training, which is the 3rd week in Nov 2011.
And each user who tested will be a marketer for the product in their area.

Who knows, maybe this group will go on to tell positive stories to users downstairs, promoting the name of the interface and encouraging buy-in.

Oct 29 Initial Report (1.5)

Initial Report

1.5 hours
****
Completed and emailed in to supervisors

Wednesday, 19 October 2011

Oct 19 - Literature Review (2)

2 hours
literature review
5 months left

reviewed an article from acm:


PRACTICE MAKES USE: USING INFORMATION TECHNOLOGIES BEFORE IMPLEMENTATION AND THE EFFECT ON ACCEPTANCE BY END USERS

User acceptance of technology
Top down management support
Provide training,
Communication with users,
Allow users to contribute and participate in decision making
Ensure that users do not feel threatened by job insecurity when introducing IT. Do this by training them

Davis (1989; Davis, Bagozzi, and Warshaw, 1989) examined perceived ease of use and perceived usefulness and their relationship to acceptance of information technologies. Perceived ease of use was defined as the degree to which a prospective user expects the technology to be free of effort. Perceived usefulness was a prospective user's perception that a specific technology will enhance his or her job performance. Perceived usefulness focuses on the payback to the end user for using the information technology and is anticipated to relate directly to end users' expectations about the information technology (Kleintop, 1993a).

Perceived usefulness was not a significant predictor of actual use when entered in Block 2 of the hierarchical regression analyses shown in Table 5. This is not consistent with Davis, et al. (1989), who found that perceived usefulness was predictive of system usage.

Sunday, 16 October 2011

Oct 16 - Literature Review (3)

***************
Sun 16th Oct 2011
Time spent: 3 hours
Deadline 5 months
***************

Continuing with Literature Review:

http://doit.maryland.gov/SDLC/Documents/User_Acceptance.pdf
Definition of User Acceptance

User acceptance is the confirmation, through testing, that the delivered system meets all requirements, functions according to design parameters, and satisfies all business, technical, and management stakeholders.


Effective user acceptance criteria:
• Are based on functional and non-functional requirements
• Are specific, unambiguous, measurable, attainable, and realistic
• Contain specific pass or fail criteria
• Address all aspects of the system in detail
• Indicate if a requirement is critical or non-critical (i.e., not required for acceptance)
• Identify unacceptable errors


The execution of user acceptance testing is most successful when:
• Predefined and approved user acceptance criteria exist
• Users of the system perform the tests
• The test environment simulates real-world usage conditions
• Tests are conducted on a completed system that has passed unit testing, integration testing, and system testing
• Tests are conducted utilizing test cases that cover all scenarios. Test cases describe the functionality (scenario) being tested, input, expected result, actual result, pass/fail status and rectification strategy for the problems discovered, test run date and time, and the name of the person/role who ran the test
• All data has been migrated/converted (if applicable) prior to user acceptance testing
• Test case execution is automated with test scripts (when possible)
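The test-case fields listed above (scenario, input, expected and actual result, pass/fail status, rectification strategy, run date/time, and the person who ran the test) map naturally onto a small record type. A minimal sketch; the class and field names are my own, not from the Maryland document, and the example case is hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class AcceptanceTestCase:
    """One UAT test case, carrying the fields from the checklist above."""
    scenario: str                     # functionality being tested
    test_input: str
    expected_result: str
    actual_result: str = ""
    rectification: str = ""           # strategy for any problem discovered
    run_at: Optional[datetime] = None # test run date and time
    tester: str = ""                  # person/role who ran the test

    @property
    def passed(self) -> bool:
        # Pass/fail is derived by comparing expected and actual results.
        return self.actual_result == self.expected_result

# Hypothetical example, not a real GG test case:
tc = AcceptanceTestCase(
    scenario="Payment is allocated automatically",
    test_input="receipt for an existing policy",
    expected_result="payment allocated to the correct policy",
)
tc.actual_result = "payment allocated to the correct policy"
tc.run_at = datetime(2011, 10, 16, 9, 30)
tc.tester = "BA"
```

Keeping each run as a record like this also gives the "predefined and approved criteria" a concrete place to live before testing starts.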

Saturday, 15 October 2011

Oct 16 - Literature Review (3)

********************************
Today: 3 hours
Literature review.

Deadline 5 months
Deadline: March 30th 2012

********************************

Have to create structure/chapters of the final report.

Training Mon and Tue GG - Must monitor user feedback.

Perhaps we can tap into the answers initially there.

The plan was to fix as many bugs as possible beforehand, so that when the actual users test on Mon and Tue they will only hit the additional issues.
Maybe we can introduce support at this time to listen, and set the expectation that this is a testing session, not a training session.

Initial reactions can cloud later ones and spread rumours.
Hence it is important to send the right message and get it right the first time: a positive message, so that positive messages can be spread.

file:///C:/Users/owner/Desktop/user%20acceptance%20thur%2025th%202011/Failed%20IT%20Projects%20(The%20Human%20Factor)%20paper.htm


Most software development projects fail because of failures within the team running them. Before a team project can be labeled as failed, the term failed must be defined. For the purpose of this research, a failed software development project is defined as (1) being over-scheduled by more than 30% and/or, (2) being over-budget by more than 30% and/or, (3) the end product did not meet user requirements. These three criteria cover the schedule, budget, and requirement, all surrounded by quality that should comprise all projects. A failed project does not include canceled projects relating to changes in requirements that resulted from changes in business needs. The focus of this research is on the project team problems of these failed projects, thus resulting in team failure.
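The three failure criteria in the quoted definition (over-scheduled by more than 30%, and/or over-budget by more than 30%, and/or requirements not met) are simple to encode. A sketch under those definitions only; the function name and parameters are mine:

```python
def is_failed(planned_days, actual_days, planned_cost, actual_cost,
              requirements_met):
    """Failed per the quoted definition: >30% over schedule, and/or
    >30% over budget, and/or the end product did not meet user
    requirements. (Projects cancelled because business needs changed
    are excluded before this check, as the excerpt notes.)"""
    over_schedule = actual_days > planned_days * 1.30
    over_budget = actual_cost > planned_cost * 1.30
    return over_schedule or over_budget or not requirements_met
```

For example, a project scoped at 30 days that took 180 counts as failed on schedule alone, even if budget held and requirements were met.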

http://www.it-cortex.com/Stat_Failure_Cause.htm

The main reasons why systems fail to meet their objectives were identified as:
- Lack of attention to the human and organizational aspects of IT.
- Poor project management.
- Poor articulation of user requirements.
- Inadequate attention to business needs and goals.
- Failure to involve users appropriately.

I want to stress the last one: to some extent it is not just user involvement, but involving users appropriately.


http://goldpractice.thedacs.com/practices/mr/index.php






This is important.

How does the strategy support this?

Tuesday, 27 September 2011

Sept 27 - Meeting with Company B (1)

Tue 27th Sept 2011

*****************************
Deadline April 30th 2012
Meeting with Company B 1 hour

*****************************


Advanced Systems Group

Team: Product Manager, Business analysts, 4 Test Users ( both technical and non technical), Developers
Project: workflow using Research and Development
Aim: Make it as user friendly as possible for both technical and non technical users. Functionally it has already achieved its goals


Methodology: DSDM, Agile, SCRUM, Microsoft framework Methodology

Plan:
2 weeks sprints of meeting with Product Manager and users to demo progress
- Stuff will be agreed upon and changes made, iterative

Test Lab - still being developed
- 3 sessions 1 hour each with user and technical staff
- sit while they use the share point work flow product

Daily 10 minute updates of status, recommendations, next steps, assistance

Roll out to clients with strict contract of expectations
Beta version will be sent to clients to test and feedback will be given to improve on the product
Poll users

Then the final version

(They do not have a high staff turnover, but they ensure that proper learning takes place and more than one person is equipped with the knowledge.)

( Proper project management, proper technical skillset)


***************
Past:

SharePoint Foundation = free

Company B would provide solutions using share point to clients but internally there was a low acceptance rate.

Initiative was taken to explore internally to encourage use of share point
-Company Branding
-User friendly -> users comments etc
-Documents were stored on share point
- Top down
- Became the culture and practice to use the company intranet

***************
Past projects for Clients
Used share point to manage/create minutes for meetings
Sent to get sign off from Permanent Secretary
Business Process Optimisation
Train the Trainer approach was used
Feedback was used
Iterative
Initially it was scoped at 1 month but really took 6 months
Microsoft took that hit

Online Service for Insurance Company
Iterative, Microsoft System Framework
Bugs were reported and dealt with as stated in service agreement
Good project management
Compromise

Another project 2007 -2011
Scope kept changing
Staff changed
14 work flows were done rather than 1
The line was never drawn in the project

Wednesday, 21 September 2011

Sept 21 - Survey work for GG (5)

Wed 21st Sept 2011

Deadline March 30th 2012
Time:   5 hours

Survey Created.
Emailed Helen (BA) about direction of survey
Emails sent to Company B and a bank for permission to work with them in final project

TO:
set up interview with BA
Agenda:
select project
select users
get her user exp

set up interview with users:
Agenda:
get their perception on system, organisation
values
trust issues
culture

Monday, 19 September 2011

Sept 9 - to do (1)

TO DO:

research TAM - for user acceptance.

What works what does not.

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC150360/

rated user acceptance by rating previous projects and new ones.

Email ravi about interviewing his place

Saturday, 10 September 2011

Sept 8 - Meeting with GG (2)



//********************************************************************
// Project Deadline                 :   March 30th 2012
//
// Meeting with stakeholders        :   Fri 9th Sept 2011
//                                      2 hours
//********************************************************************


Objectives:


To make sure everyone is on board and knows the general direction of the project


Outcome:
Permission was granted to allow work on the user acceptance project.
Interaction with users must be done outside of working hours.

Issues arising from Meeting:

There is NOT low user acceptance at Company A. But there are acceptance issues.

Different Types of projects. Must dig deeper into this one. Not only RAD methodology used at company A.
Some successful others not so much
Look at projects that were in both categories and create the surveys based on that
Surveys must not be done during working hours.
Send surveys for browsing by stakeholders before sending out to users.
Select users based on the projects selected.
PUI can still be used PARTLY. Because right now there is user communication going on.
March Launch is too close to measure results in entirety.
Hence Tasks will be used from  system requests and enhancement projects


Different user acceptance for different projects
Projects were done differently due to resources and skill set at the time

Company A would have their own change management strategy, and if this is mentioned in the project it will be properly referenced that Company A did X, so there is no plagiarising.

When the standards are established, it is up to Company A to decide what they will choose to implement.
This is acceptable; there is no obligation for the company to adhere to it. These are just recommendations.

Next steps:
  • Define projects I will be working with. These are past projects successful and not so successful ones.
  • Create survey for users. (5250, Genie users based on projects selected).
  • Send stakeholders survey.
  • Distribute to users outside of office hours
  • Collect results

Tuesday, 6 September 2011

Sept 9 - Timeline for Project (0.5)


Sept 6 - Survey work & research (3)

//********************************
// Project Deadline :   March 30th 2012
//
// Read Literature :   2hours
// Planning Survey :   1 hour
//
//********************************


Send out the survey to Genie and green-screen users.

These users must also be testing or part of the new PUI changes, so that the survey can be given to them again to record their feedback on the ProfoundUI screens.

This will be the BEFORE and AFTER feedback.

Survey both: Genie and Green Screen Users

Questions will follow this pattern:


Levels of User Acceptance
  • Initial adoption (when PUI is first implemented, by testers and users)
  • Continued use (continued use without bickering)
  • Contributing (adding comments, recommendations, and enhancements to make the system more useful and efficient)
  • Promoting (inviting others)
Levels of User Acceptance rated between 1 and 5
1 strongly disagree/dislike
2 disagree/dislike
3 neutral
4 agree/like
5 strongly agree/like

Ease of user navigating screens such as function keys and tabbing
Impact of color combination on eyes after long hours on the screen
Extra features/enhancements that the green screen did not offer
Adequate training for enhancement or screens
Adequate support after the training
Users prefer new PUI rather than green screens
Speed of using PUI pages/Browsers issues?
Bugs were fixed in an adequate amount of time
User Recommendations were taken into consideration
Downtime of the application was not noticeable/frequent
Consistent look and feel throughout
Layout of content is easy on the eyes

In what ways did ProfoundUI restrain/limit users? Explain
What does the green screen offer that the web page does not do? Explain

What were the user acceptance issues with the green screen????
What were the user acceptance issues with the Genie?????
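Once responses come back, the 1-5 ratings above can be averaged per question and compared between the BEFORE round (green screen/Genie) and the AFTER round (PUI). A minimal sketch; the question keys and numbers are made up for illustration:

```python
def mean_scores(responses):
    """responses: list of dicts mapping question -> Likert rating (1-5).
    Returns question -> mean rating across respondents."""
    totals = {}
    for answer in responses:
        for question, rating in answer.items():
            totals.setdefault(question, []).append(rating)
    return {q: sum(r) / len(r) for q, r in totals.items()}

# Hypothetical BEFORE (green screen/Genie) and AFTER (PUI) responses:
before = [{"ease_of_navigation": 2, "adequate_training": 3},
          {"ease_of_navigation": 3, "adequate_training": 4}]
after = [{"ease_of_navigation": 4, "adequate_training": 4},
         {"ease_of_navigation": 5, "adequate_training": 5}]

# Positive values mean the PUI round rated higher on that question.
improvement = {q: mean_scores(after)[q] - mean_scores(before)[q]
               for q in mean_scores(before)}
```

The same per-question means could also be broken out by the four acceptance levels (initial adoption, continued use, contributing, promoting) to see where acceptance stalls.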


Survey List to be updated with BA input


************** 


Determine the causes of lack of User Acceptance for Green Screen and Genie


What is the answer????

Sept 9 - Plan (1)

PLAN:

I think the tests can apply to both ProfoundUI and enhancements in GG. Record results.

20 test users
each with their own circles of information buddies and connections

5 green screen users
5 genie users
5 who use both Genie and green screen
5 new users

Select opinion leader and non opinion leader
Select person at hub of other people
Select solitary users

The short-term aim is to see if they can influence their peers (who do not belong to this group of 20 people) to use, or have a positive attitude towards, the PUI/enhancements.
Ideally, choose users who are at the hub of other people.

For example, Susan is a good person to recruit for testing/engagement because her social network at work is wider than, say, Jennifer's. Susan's chances of pulling users over to PUI might be higher.

So Susan and Jennifer will still both be recruited for test purposes, to put this theory to the test.





Select users based on opinion leaders. This means users who have more media power within the employee clique - employees whom other employees listen to or look to for answers. Someone who leads the pack, with followers.

Cash/Claim/Underwriting

Which users/testers have been selected so far? Which categories do they fall into?
Which category does Natasha fall into?
She is a green-screen user but was able to follow the cash screens quickly.
Where is her circle of friends? How many people is she linked to, as in the diagram above?

The aim is to see if being an opinion leader versus a non-opinion-leader makes any difference in bringing people over to PUI.
The aim is also to see if people at the hub of users can bring more people over.

Must choose the testers/users carefully to see what aids in making end user acceptance a reality.

This test is necessary because in the past there were opinion leaders who spoke adversely about Genie and influenced people who had not even used Genie before, making user acceptance more difficult.

By recruiting users at the hub, like Susan (in the diagram), if it works we can possibly increase the number of users buying in: a majority in numbers, and the few stragglers can fall in with the majority.


If recommendations to use PUI are being made by an opinion leader, then this is an indicator of acceptance as well.

************** 

Recordings will be taken from now (Sept 2011) onwards of user reactions at different stages.

Testing, Initial launch, Continued use, Feedback from users, recommendations.







Sept 4 - User Acceptance (2)

Define User Acceptance.

Levels of User Acceptance


  • Initial adoption (when PUI is first implemented, by testers and users)
  • Continued use (continued use without bickering)
  • Contributing (adding comments, recommendations, and enhancements to make the system more useful and efficient)
  • Promoting (inviting others)



Levels of User Acceptance rated between 1 and 5
1 strongly disagree/dislike
2 disagree/dislike
3 neutral
4 agree/like
5 strongly agree/like

Ease of user navigating screens such as function keys and tabbing
Impact of color combination on eyes after long hours on the screen
Extra features/enhancements that the green screen did not offer
Adequate training for enhancement or screens
Adequate support after the training
Users prefer new PUI rather than green screens
Speed of using PUI pages/Browsers issues?
Bugs were fixed in an adequate amount of time
User Recommendations were taken into consideration
Downtime of the application was not noticeable/frequent
Consistent look and feel throughout
Layout of content is easy on the eyes

In what ways did ProfoundUI restrain/limit users? Explain
What does the green screen offer that the web page does not do? Explain

What were the user acceptance issues with the green screen????

What were the user acceptance issues with the Genie?????

Based on the GG crowd, younger people tend to gravitate to the Genie web-based interface, but older folks are more resistant to change - to moving away from what they know.

Conditions that have helped in the past:
It also helps with Genie if there is top down management involvement to help introduce the change to the employees.

Also helps if the managers use the application to break the ice with users.

Work on the opinion leaders. get opinion leaders as the testers

Delimiters in the past:

Genie was not properly managed; therefore proper investigation of machine requirements and upgrades was not done.
No speed test was done.
Opinion leaders advertised that it was slow and hard on the eyes, so the news spread and discouraged people from using it even before trying it themselves.

Culture
Culture of company - top down management
Culture of the individual
            - Older folks: more resistant to change
            - Younger folks: willing to take ownership and try new things, more technology/online aware, take more risk

Individual Reaction to people/managers.
If a manager is disliked and he/she is pushing PUI, then due to subconscious messages users might not want to support an initiative taken by that manager. More resistance.

Will the past with genie discourage users from trying out the new web version of PUI?

IDEA:

Social Networking Analysis?
People and chains: if I talk to 10 people regularly and you talk to 2 people regularly, it might be more productive to engage me in testing for ProfoundUI, since I can bring in and convince more people to use PUI.

Get opinion leaders as the testers, since they influence other people. Once buy-in from the opinion leaders is obtained, they can bring other users in.
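The "hub" idea above (recruit the best-connected person, e.g. Susan over Jennifer) is essentially degree centrality from social network analysis. A toy sketch; the names and links are hypothetical:

```python
def degree_centrality(links):
    """links: dict mapping person -> set of people they talk to regularly.
    Returns person -> number of regular contacts (their degree)."""
    return {person: len(contacts) for person, contacts in links.items()}

def pick_opinion_leaders(links, k=2):
    """Recruit the k best-connected people as testers/promoters."""
    degrees = degree_centrality(links)
    return sorted(degrees, key=degrees.get, reverse=True)[:k]

# Hypothetical work network: Susan talks to 10 people, Jennifer to 2.
network = {
    "Susan": {"A", "B", "C", "D", "E", "F", "G", "H", "I", "J"},
    "Jennifer": {"A", "B"},
    "Natasha": {"C", "D", "E"},
}
leaders = pick_opinion_leaders(network, k=1)
```

Degree alone does not capture influence quality (an opinion leader with few but highly trusted ties may matter more), which is why the plan recruits both Susan and Jennifer and compares outcomes.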