Saturday, 29 October 2011

Sept 10 - Survey Work for GG

Sat 10th Sept 2011
//  *********************************************
//  Project Deadline                                : March 30th 2012
//
//  Generate Survey and select projects  :
//  *********************************************

Generate survey.
Select projects: successful and less successful projects/enhancements/system requests.

Oct 29 - Mon Tue Testing session at GG (16 hours)

**********************
Training/Testing session to introduce the users to ProfoundUI in Company 1
2 days of testing (16 hours)
**********************

Training/testing sessions were held Mon and Tue, for 2 full days.

Brought in users from different departments: 5250 users and Genie users.

Both slow and fast processors were included. They were given the transactions they process daily, as well as another area they might not be familiar with.

So they had to test all areas. This was done with the new ProfoundUI skin or interface.

They were introduced to it for the first few hours, then let go; they would test as if they were processing in the live environment.
Any issues found that stopped them were fixed almost immediately.

It was a one-on-one test session, i.e. each user worked the system while the BA and programmers walked around and noted issues, positives and negatives.

Issues with the previous interface (Genie), e.g. speed, had been fixed, so it was not the same set of setbacks as before.

Overall: positive feedback. 4 out of 6 users said they could use the system now as it is and be fine.


Outline: the BA tested and the programmer got all major issues out of the way ASAP ahead of user testing on these 2 days; then the users tested, gave feedback, and the remaining bugs were collated to be fixed.

Because most of the issues were already out of the way, it was not traumatizing to the users and they could process throughout without stopping.

User concerns and recommendations were taken into consideration, so they had a voice; they were empowered to make comments and be listened to.

A cross-section of users tested, so it was a good test group to represent the wider user base.

Recommendations were minor, with no major changes. Overall they were satisfied.

The next training session will be when all the islands come in for training, in the 3rd week of Nov 2011.
And each user who tested will be a marketing person for the product in their area.

Who knows, maybe this group will go on to tell positive stories to the users downstairs, promoting the name of the interface and encouraging buy-in.

Oct 29 Initial Report (1.5)

Initial Report

1.5 hours
****
Completed and emailed to supervisors.

Wednesday, 19 October 2011

Oct 19 - Literature Review (2)

2 hours
literature review
5 months left

Reviewed an article from the ACM:


PRACTICE MAKES USE: USING INFORMATION TECHNOLOGIES BEFORE IMPLEMENTATION AND THE EFFECT ON ACCEPTANCE BY END USERS

User acceptance of technology:
Top-down management support
Provide training
Communication with users
Allow users to contribute and participate in decision making
Ensure that users do not feel threatened by job insecurity when introducing IT; do this by training them

Davis (1989; Davis, Bagozzi, and Warshaw, 1989) examined perceived ease of use and perceived usefulness and its relationship to acceptance of information technologies. Perceived ease of use was defined as the degree to which a prospective user expects the technology to be free of effort. Perceived usefulness was a prospective user's perception that a specific technology will enhance his or her job performance. Perceived usefulness focuses on the payback to the end user for using the information technology and is anticipated to relate directly to end users' expectations about the information technology (Kleintop, 1993a).

Perceived usefulness was not a significant predictor of actual use when entered in Block 2 of the hierarchical regression analyses shown in Table S. This is not consistent with Davis, et al. (1989) who found that perceived usefulness was predictive of system usage.
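
As a side note for myself: "entered in Block 2 of the hierarchical regression" just means the predictor is added in a second step and judged by the extra R-squared it contributes. A minimal sketch in Python, with made-up data and variable names of my own (this is not the paper's data set or analysis):

import numpy as np

def r_squared(X, y):
    # Ordinary least squares with an intercept; return R^2 for these predictors.
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    ss_res = np.sum((y - X @ beta) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1 - ss_res / ss_tot

# Hypothetical survey scores (invented purely for illustration).
rng = np.random.default_rng(0)
n = 60
ease_of_use = rng.normal(size=n)                       # Block 1 predictor
usefulness  = rng.normal(size=n)                       # Block 2 predictor
actual_use  = 0.6 * ease_of_use + rng.normal(scale=0.8, size=n)

r2_block1 = r_squared(ease_of_use.reshape(-1, 1), actual_use)
r2_block2 = r_squared(np.column_stack([ease_of_use, usefulness]), actual_use)

# A negligible change in R^2 from Block 1 to Block 2 means the Block 2
# predictor (perceived usefulness here) is not adding explanatory power.
print(f"R2 block 1: {r2_block1:.3f}, R2 block 2: {r2_block2:.3f}, change: {r2_block2 - r2_block1:.3f}")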

Sunday, 16 October 2011

Oct 16 - Literature Review (3)

***************
Sun 16th Oct 2011
Time spent: 3 hours
Deadline 5 months
***************

Continuing with Literature Review:

http://doit.maryland.gov/SDLC/Documents/User_Acceptance.pdf
Definition of User Acceptance

User acceptance is the confirmation, through testing, that the delivered system meets all requirements, functions according to design parameters, and satisfies all business, technical, and management stakeholders.


Effective user acceptance criteria:
• Are based on functional and non-functional requirements
• Are specific, unambiguous, measurable, attainable, and realistic
• Contain specific pass or fail criteria
• Address all aspects of the system in detail
• Indicate if a requirement is critical or non-critical (i.e., not required for acceptance)
• Identify unacceptable errors
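
To pin down for myself what "specific, measurable, with pass/fail criteria" could look like in practice, here is a small sketch (Python, with field names and an example response-time criterion that I invented, not taken from the DoIT document):

from dataclasses import dataclass

@dataclass
class AcceptanceCriterion:
    requirement_id: str   # traces back to a functional or non-functional requirement
    description: str      # specific, unambiguous statement of what must hold
    measure: str          # how it is measured
    threshold: str        # explicit pass/fail boundary
    critical: bool        # critical criteria must pass for acceptance

# Example non-functional criterion for the new interface (hypothetical numbers).
crit = AcceptanceCriterion(
    requirement_id="NFR-012",
    description="Order entry screen responds after submit",
    measure="seconds from pressing Enter to the next screen",
    threshold="pass if <= 2.0 seconds in the test environment",
    critical=True,
)

def passes(measured_seconds: float, limit: float = 2.0) -> bool:
    # Unambiguous pass/fail check for the example criterion above.
    return measured_seconds <= limit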


The execution of user acceptance testing is most successful when:
• Predefined and approved user acceptance criteria exist
• Users of the system perform the tests
• The test environment simulates real-world usage conditions
• Tests are conducted on a completed system that has passed unit testing, integration testing,
and system testing
• Tests are conducted utilizing test cases that cover all scenarios.  Test cases describe the
functionality (scenario) being tested, input, expected result, actual result, pass/fail status and
rectification strategy for the problems discovered, test run date and time, name of the
person/role who ran the test
• All data has been migrated/converted (if applicable) prior to user acceptance testing
• Test case execution is automated with test scripts (when possible)
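
The test case bullet above lists the fields each case should record; a rough sketch of that record as a structure (Python, field names are my own, not from the document):

from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class UatTestCase:
    scenario: str                       # functionality being tested
    test_input: str                     # input supplied by the tester
    expected_result: str
    actual_result: str = ""
    passed: Optional[bool] = None       # pass/fail status; None = not yet run
    rectification: str = ""             # strategy for any problem discovered
    run_at: Optional[datetime] = None   # test run date and time
    run_by: str = ""                    # person/role who ran the test

# Example entry from a session like the one at GG (invented values).
tc = UatTestCase(
    scenario="Process a daily transaction in the new ProfoundUI interface",
    test_input="Order 12345, quantity 10",
    expected_result="Order accepted and confirmation screen shown",
)

# After the run, the tester fills in the outcome.
tc.actual_result = "Order accepted and confirmation screen shown"
tc.passed = tc.actual_result == tc.expected_result   # True here
tc.run_at = datetime.now()
tc.run_by = "5250 user, accounts department"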

Saturday, 15 October 2011

Oct 15 - Literature Review (3)

********************************
Today: 3 hours
Literature review.

Deadline: 5 months (March 30th 2012)

********************************

Have to create structure/chapters of the final report.

Training Mon and Tue GG - Must monitor user feedback.

Perhaps we can tap into the answers there initially.

The plan was to fix as many bugs as possible beforehand, so that when the actual users test on Mon and Tue they will catch the remaining, additional bugs.
Maybe we can introduce support at this time to listen and to set the expectation that this is a testing session, not a training session.

Initial reactions can cloud later impressions and spread rumours.
Hence it is important to send the right message and get it right the first time: a positive message, so that positive messages can be spread.

file:///C:/Users/owner/Desktop/user%20acceptance%20thur%2025th%202011/Failed%20IT%20Projects%20(The%20Human%20Factor)%20paper.htm


Most software development projects fail because of failures within the team running them. Before a team project can be labeled as failed, the term failed must be defined. For the purpose of this research, a failed software development project is defined as (1) being over-scheduled by more than 30% and/or, (2) being over-budget by more than 30% and/or, (3) the end product did not meet user requirements. These three criteria cover the schedule, budget, and requirement, all surrounded by quality that should comprise all projects. A failed project does not include canceled projects relating to changes in requirements that resulted from changes in business needs. The focus of this research is on the project team problems of these failed projects, thus resulting in team failure.
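
Translated into a quick check (Python; the 30% thresholds come straight from the definition above, the function name is mine):

def is_failed_project(schedule_overrun_pct: float,
                      budget_overrun_pct: float,
                      meets_user_requirements: bool) -> bool:
    # Failed per the definition above: more than 30% over schedule, and/or
    # more than 30% over budget, and/or the product did not meet user requirements.
    return (schedule_overrun_pct > 30
            or budget_overrun_pct > 30
            or not meets_user_requirements)

# Example: 10% over schedule, 35% over budget, requirements met -> failed (budget).
print(is_failed_project(10, 35, True))   # True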

http://www.it-cortex.com/Stat_Failure_Cause.htm

The main reasons why systems fail to meet their objectives were identified as:
• Lack of attention to the human and organizational aspects of IT.
• Poor project management.
• Poor articulation of user requirements.
• Inadequate attention to business needs and goals.
• Failure to involve users appropriately.

I want to stress the last one to some extent: not just user involvement, but involving users appropriately.


http://goldpractice.thedacs.com/practices/mr/index.php

This is important.

How does the strategy support this?