Home Depot: The Customer Experience

This is the first blog of a new series called “Who Needs Training?”.  This series is designed to provide the consumer’s perspective on the need for staff and management training, based on personal experiences with businesses. We have all encountered both excellent and poor customer service; those experiences are critical to the success of any business. By approaching these experiences through the lens of Human Performance Technology, we can identify those who need training and professional development, appreciate others who have been trained effectively, and apply those behaviors and practices in the workplace. This is by no means a complete Human Performance Technology analysis, but rather feedback from a customer’s viewpoint.

My wife has an amazing green thumb and enjoys caring for her small garden.  Unfortunately, so do the bugs. Her solution was to purchase more bugs – specifically, ladybugs, which are supposed to protect her tomato plants. So today, October 15, 2012, at approximately 4:30 p.m., I accompanied my wife to Home Depot in search of these protective ladybugs.  I know nothing about gardening or bugs, so the practical thing to do was to seek a little more information in the Garden Section of Home Depot. To my surprise, the customer service experience was terrible. Because of the time of day, the store was fairly empty, so finding employee assistance was very easy; however, the employees’ helpfulness ended with their accessibility.  As we approached the cashier in the Garden Section, we seemed to startle him, as if he had been hiding or avoiding work.  When asked whether the store sold ladybugs, his only response was that he does not normally work in the Garden Section and that we should find someone else to ask!

I am a practical individual and I understand that it’s unrealistic to expect staff members to know where every item in the store is located; however, through basic customer service training, all staff should have learned and developed the “soft skills” necessary to assist customers with their needs. Calling for additional help, taking a moment to reference a directory of product inventory, or responding with a simple phrase such as “I am not sure, let me find out for you” is the difference between good and poor customer service.

My initial reaction was that this particular staff member was lazy or simply a poor employee; however, a second staff member working in the Garden Section provided a similar response and again offered us no additional assistance.  He indicated that he was new to the Garden Section and thought that the ladybugs were a seasonal item, but was uncertain. Based on this second encounter, and our inability to find any additional staff members trained to assist us in the Garden Section, we left the store with a negative experience rather than our ladybugs.

A classic model used in Human Performance Technology is included below and will be used to analyze the customer experience throughout this blog series.

The Person               The Environment
Skills and Knowledge     Data or Information
Capacity                 Tools and Setting
Motives                  Incentives

When reviewing the customer experience at Home Depot, we will first address “the person”. Both staff members we interacted with lacked knowledge of the Garden Section’s products. Regarding capacity, both individuals appeared capable of acquiring the skills required to complete the tasks they were presented with; however, we can only assume that they had not been trained to develop such skills and knowledge and/or lacked the motivation to provide simple customer service.

From an environmental aspect, neither staff member referred to any reference or resource to answer our question, which leads me to believe that supportive material is not provided to staff members. This reflects poor management: a failure to provide an environment in which staff can succeed. Given the similar responses we received from multiple staff members, I can conclude that these staff members have not been trained in customer service skills, and that the employees have been set up for failure by not being given the tools necessary to satisfy each customer’s needs. If one argues that these staff members are trained and prepared with everything they need to succeed, then there would appear to be a lack of incentives for employees, as both individuals lacked any desire to go above and beyond to find the answer to a relatively simple question about product inventory.

To consistently quantify and simplify the customer service experience from a Human Performance Technology approach throughout this blog series, we will refer back to the HPT model and give an “X” for each failed row. Since the model is built in three rows, the worst score a business can receive is “XXX”, while the best score would be no “X”s. In my opinion, the Home Depot experience scores as follows (a short code sketch of this scoring scheme appears after the scorecard):

Home Depot Experience (10/15/12)

    The Person & The Environment
X Skills and Knowledge & Data or Information
X Capacity & Tools and Setting
X Motives & Incentives

Human Performance Technology Grade: XXX
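
To make the grading concrete, here is a minimal sketch in Python; the names HPT_ROWS and hpt_grade are my own illustrative choices, not part of any established HPT toolkit:

```python
# Minimal sketch of the three-row HPT scorecard used in this series.
# Each row pairs a "person" factor with an "environment" factor; a row
# earns an "X" when the experience failed on that pair.

HPT_ROWS = [
    ("Skills and Knowledge", "Data or Information"),
    ("Capacity", "Tools and Setting"),
    ("Motives", "Incentives"),
]

def hpt_grade(failed_rows):
    """One "X" per failed row: best grade is "" (no X's), worst is "XXX"."""
    return "".join("X" for row in HPT_ROWS if row in failed_rows)

# Home Depot experience (10/15/12): all three rows failed.
print(hpt_grade(set(HPT_ROWS)))  # -> XXX
```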

If you are interested in learning how to maximize Human Performance Technology for your business, please contact Higher Power Training.

M-Learning (Thesis Volume 5.3: Implications for Further Research)

As the Ocean Institute Podcast Program continues to grow, future research will likely lead toward implementing such a program in other subjects taught at the Ocean Institute and at other informal education centers.  The next step will be to address the Social Science and History programs offered at the Ocean Institute.  Once the pilot program is completed and all assessments have been analyzed, development of a Social Science program may begin by next fall.  Other potential directions include partnering with the Orange County Performing Arts Center, Scripps Institution of Oceanography, various art museums, other natural history museums, and San Diego high schools to disseminate the program and assist in developing similar programs statewide.

M-Learning (Thesis Volume 4.2: Project Evaluation – Data Presentation)

The program was divided into three components: professional development, the field trip, and podcast production.  As stated earlier, the professional development component is tangential to the main focus of this report and will not be addressed.  Although the evaluation process is in its early stages, the assessment tools have produced some beneficial results.  Thus far, 100 students from three different high schools and one middle school have participated in the Ocean Institute Program.  Most students have not completed their podcasts at this stage; however, a handful of accelerated students have successfully completed their podcasts and submitted the post-test and survey.

The first evaluation was designed to establish students’ knowledge of the topics addressed on the field trip.  The pre-tests submitted reflected a general knowledge of those topics.  The average pre-test grade was 72%; 5% of the students scored 100% on the test, while 54% scored 50% or less.  The post-tests, although not completed by all students at this point, have shown significant improvement in academic knowledge: the average increased by 24%, and the lowest score was 69%.  Refer to Table 1 to compare the results of the pre-tests and post-tests.
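
To show the arithmetic behind these summary figures, here is a small illustrative sketch in Python; the score lists are hypothetical stand-ins, not the actual data from Table 1:

```python
# Hypothetical pre-/post-test percentages, used only to illustrate how the
# summary statistics above (averages, improvement, share scoring at or
# below 50%) would be computed from the real data.
pre_scores = [50, 45, 72, 100, 88, 64]
post_scores = [74, 69, 95, 100, 100, 96]

def average(scores):
    return sum(scores) / len(scores)

improvement = average(post_scores) - average(pre_scores)
share_low = sum(1 for s in pre_scores if s <= 50) / len(pre_scores)

print(f"Pre-test average:  {average(pre_scores):.0f}%")
print(f"Post-test average: {average(post_scores):.0f}%")
print(f"Average improvement: {improvement:.0f} points")
print(f"Pre-test scores at or below 50%: {share_low:.0%}")
```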

The evaluation process continued by analyzing the student surveys submitted; at this point, only 22% of the surveys have been completed.  From the surveys it was determined that 80% of the students had never been to the Ocean Institute; two students had never seen the ocean.  The surveys revealed a number of positive remarks:

  1. “I loved the hands-on activities and all the fish.”
  2. “It was nice to learn outside of the classroom.”
  3. “It was cool to take so many pictures and video of the field trip.”
  4. “It was the first time I have seen the ocean.”
  5. “I wish the field trip was longer.”
  6. “Making a podcast for school was meaningful.”
  7. “I make tons of videos, doing one for school was cool.”
  8. “I hope my podcast gets the most downloads.”
  9. “I learned a lot about podcasting and marine science.”
  10. “I downloaded all of my friends’ podcasts to my iPod.”

The evaluation process also included surveys completed by Ocean Institute visitors who used the podcasts while visiting; 90% of these surveys were completed by Ocean Institute members.  The surveys revealed that the average visitor listened to 7 podcasts, though individual visitors listened to anywhere from 1 to 13.  85% of the visitors felt the podcasts enhanced the exhibits and made the trip to the Ocean Institute more educational and memorable.  Some of the valuable comments received included:

  1. “I have always enjoyed how the Ocean Institute showcases students’ work; these podcasts were another example of the impact OI has on children.”
  2. “I have been to the Ocean Institute numerous times, however, I learned more on this visit than all the other visits combined.”
  3. “I have used podcasts at other museums, and although they are more professionally produced, I felt having students producing podcasts from their own experiences was very special.”
  4. “Some of the podcasts were outstanding; did not listen to the longer podcasts.”
  5. “These students should be proud of what they have developed.”
  6. “I had never been to the Ocean Institute; the podcasts really helped me understand what happens at the facility.”

Although an initial evaluation has been completed through the student tests and surveys and the Ocean Institute visitor surveys, there has been no formal feedback at this point from the teacher surveys.  From what has been observed, many of the uncertainties teachers had were addressed during professional development; there has also been constant support provided during the podcast production process for teachers who felt they needed it.

M-Learning (Thesis Volume 3.2: Project Design – Procedure)

A number of Ocean Institute programs are grant funded, many with a dissemination requirement.  One platform for dissemination has been to highlight the programs on weekends, allowing public visitors to talk to students and view their work.  This process has provided the Ocean Institute with positive feedback and appreciation from visitors, parents, teachers, and students.  The concept behind the Ocean Institute Podcast Program (OIPP) was that podcasts could provide similar feedback while elevating the education of both the students and the visitors to another level.  From this original concept, grant proposals were written to fund such a program; the idea was to find a small grant to develop a pilot program.  Once the pilot was completed, and if it proved potentially successful, larger grants would be pursued.  Once the funds were in place, the design process began.

  • Step 1:  Potential improvements for the weekend open house were identified.
  • Step 2:  An investigation of various technology platforms to present educational information was conducted.
  • Step 3:  The demographics of the target audience were identified.
  • Step 4:  Both student and program objectives were identified.
  • Step 5:  Assessments were identified to correlate with the objectives.
  • Step 6:  Partnerships with the Orange County Stellar Technology High Schools were developed.  A meeting with members of the high schools was held to brainstorm content, delivery, and other aspects of the program.
  • Step 7:  The field trip and program content was developed.
  • Step 8:  Throughout the instructional design process, a series of tests was performed to measure the usability and progress of the program.  The tests were performed at the middle and end of the instructional design process and were completed by Ocean Institute program developers, Ocean Institute instructors, Ocean Institute student volunteers, and teachers from participating schools.
  1. Exploratory Testing: The first test performed was the exploratory test.  This test took place in the middle of the instructional design process and was completed by both Ocean Institute program developers and instructors.  The test instruments included an invitation to participate in the test, a test script explaining the purpose of the test, a sample podcast, a program assessment questionnaire, and a group exit interview.  At the end of the testing, modifications were made based on the recommendations of the staff.  The test exposed flaws in instructional design, directions, time allocation, and general problems that might arise.
  2. Assessment Test: At the end of the instructional design, an assessment test was completed by a few of the participants who had completed the exploratory tests, along with Ocean Institute student volunteers.  The test instruments included an invitation to participate in the test, a test script explaining the purpose of the test, a sample podcast, a program assessment questionnaire, and a group exit interview.  The results obtained from completing the program, along with the program assessment questionnaire and exit interview, provided direction for program modification prior to the validation testing.  Again, testing exposed flaws in design, equipment, directions, and general issues that might arise.  The podcasts developed during the assessment test provided excellent examples for students participating in the program in the future.
  • Step 9:  Once the content was developed, the appropriate platforms for delivery were identified and developed; this included the program website, flash videos, PDF files, online tests, and other resource links.
  • Step 10:  An eight-hour professional development seminar was designed; this process will not be addressed here, as the thesis focuses on the actual Ocean Institute Podcast Program rather than the professional development course.
  • Step 11:  Professional development was held to review the program curriculum, introduce and navigate through the software, and have teachers produce podcasts.  A validation test was conducted at the end of the seminar.
  1. Validation Test: The final test distributed was the validation test.  This test provided input from people not associated with the Ocean Institute.  Teachers participating in the program had an opportunity to take the validation test, providing the final input before the program was finalized.  The teachers represented grades 6 through 12 in various school districts.  The validation test helped verify that the program met all of the needs identified by teachers and encouraged them, along with other teachers, to participate in the program.  The test instruments included an invitation to participate in the test, a test script explaining the purpose of the test, a sample podcast, and a program assessment questionnaire.
  • Step 12:  Suggestions, comments, and the results of the validation tests conducted during professional development were analyzed; changes were implemented where deemed necessary.
  • Step 13:  Program surveys were designed for participants for data analysis.

M-Learning (Thesis Volume 2.2: Review of the Literature)

I am a firm believer that although some students may fail under the constraints of formal education, those same students may continue to learn and succeed with the aid of alternative learning opportunities provided throughout our society.  Dr. Gardner states that “Human beings have tremendous capacities to learn and develop, as can easily be seen if one watches a child actively exploring his environment during the first years of life” (Gardner, 2004, 249).  I am completely in agreement with Dr. Gardner but question what limits a person’s capacity for learning as they mature.  What in human nature, society, or education prevents children from continuing to learn at the same capacity as in their first years of life?  In my opinion, the structure of education places constraints on students’ ability to learn.  This opinion is backed by Dr. Gardner, who points out that “educators should exploit the cognitive and affective powers of the five-year-old mind (an energetic, imaginative, and integrating kind of learner) and attempt to keep it alive in all of us” (Gardner, 2004, 250).

Dr. Gardner proposes four reforms to assist in the improvement of the educational system.  The first deals with how students are assessed, the second is concerned with the quality of curriculum, the third addresses teacher practices in the classroom, and the final component is community support.  Although my opinions are somewhat mixed regarding Dr. Gardner’s critique of the educational system, I am in complete agreement with his proposed reforms.  Assessment is critical to any program, including the educational system.  Standardized testing is a poor evaluation because it only measures a student’s ability to regurgitate information.  Building portfolios, as mentioned by Dr. Gardner, is a much truer evaluation of students’ abilities because it allows students to problem-solve, use creativity, and develop a deeper understanding of the information presented in class.  As stated by Dr. Gardner, “unless the accompanying curriculum is of quality, the assessment has no use” (Gardner, 2004, 254).  The second and third reforms, dealing with curriculum and teachers, are related to one another.  Although Dr. Gardner does not go into great detail about improving the quality of the curriculum, he does note that professional development is critical for improving the quality of teacher practices in the classroom.  I am a firm believer that professional development is necessary for teachers to continue to improve in their professions, as well as to introduce new ideas, information, and technology that may help teachers improve the quality of learning in their classrooms.  The fourth and final reform addresses the need for community support.  Dr. Gardner calls for local communities to be active in the schools.  I think this is very important for motivating students for the future, as well as for introducing real-world applications of the information taught in the classroom.

With the need to improve education and training, many have turned to technology as an effective learning tool.  Simulations and video games are currently being utilized in school classrooms, businesses, the military, museums, flight training, and NASA.  The potential benefits of video games and simulations include improved reading skills, logical thinking, observation skills, vocabulary development, problem solving, and strategic planning.  With such diversified demand for simulations and educational video games, many software companies have recently risen to the challenge, offering programs to fit nearly any need.

M-Learning (Thesis Volume 1.5: Delimitations/Definitions)

Delimitations

Working with the resources of a nationally recognized educational facility aided in implementing such a systemic informal educational program in classrooms.  Obvious limitations were addressed and minimized to allow the program to reach its fullest potential; these limitations included the willingness of participating schools and teachers, the financial support needed to operate a technology-driven program, and the time necessary to complete the program successfully.  The Ocean Institute has a well-established reputation in the local educational community; however, convincing schools and teachers to participate in an experimental program was very difficult.  This issue was addressed by contacting and working with the Orange County Stellar Technology High Schools; these schools are funded to participate in or develop such experimental programs.  The participation of these schools provided enough students, teachers, and feedback to move forward with a pilot program.  In this era of restricted budgets, finances must also be addressed.  Although these schools receive funds to cover many of the elements required by a program like this one, including computers and software, there were many other elements that required financial support, including field trip costs such as buses and substitute teachers, professional development, iPods, and Ocean Institute staffing.  To reconcile this problem, a small grant was awarded to the Ocean Institute, allowing the program to move forward; this grant provided the support to rent buses, pay substitute teachers, purchase a limited quantity of iPods, pay Ocean Institute staff, and provide professional development for participating teachers.  In recognition of the constant need for financial support in such a program, a strong relationship must be developed between the grant funder and the Ocean Institute.  The third critical limitation was designating the time required for such a program.  Time constraints arose with professional development training, program feedback from all participants, and the classroom time necessary to complete the podcasts.  Field trips have become limited in schools, not only due to budget constraints but also because of the high demands of standardized testing; asking teachers to spend approximately 32 hours of classroom time to participate in a pilot program was difficult.  Fortunately, enough teachers stepped forward in their eagerness to participate.  Although these limitations had little hindrance on the overall design and outcome of the program, data analysis and demonstrated success are essential if this program is to continue beyond the pilot.

Definitions

For purposes of this project, the following words are defined:

  • iPods:  Portable media players produced by Apple that play specific digital media formats, including mp3 and mp4.
  • Mp3:  A digital media format used by digitally created audio files.  These files are recognized by iTunes and QuickTime Player.  They can be downloaded and played on iPods.
  • Mp4:  A digital media format used by digitally created video files.  These files are recognized by iTunes and QuickTime Player.  They can be downloaded and played on iPods.
  • Podcasts:  Media files, most commonly in mp3 or mp4 format, that are distributed over the Internet for playback on personal computers and portable media players.  Podcasting refers to the distribution of media files by syndication feeds through which new files are automatically downloaded to subscribers, but media files downloaded manually from the Internet are also generally referred to as podcasts (Copley, 2007).  (A small code sketch of this feed mechanism appears after this list.)
  • Informal Learning:  Generally refers to learning that occurs outside the traditional, formal school realm.  These sites range from museums and science centers to casual areas that some might not even notice for their potential as educational venues (McComas, 2006).
  • Ocean Institute Visitors:  The Ocean Institute is a closed-campus informal education center serving more than 90,000 students a year.  On weekends, the Ocean Institute opens its doors to the general public, much like an open house, providing exhibits, instructional programs, and marine animals for visitors.
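
For readers curious about the feed mechanism behind the podcast definition above, here is a minimal sketch in Python using only the standard library; the feed URL is a hypothetical placeholder, not a real Ocean Institute address:

```python
# Sketch of podcast syndication: read an RSS feed, find each episode's
# <enclosure> element (which carries the mp3/mp4 URL), and download the file.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.org/oipp/podcast.xml"  # hypothetical feed address

with urllib.request.urlopen(FEED_URL) as response:
    channel = ET.parse(response).getroot().find("channel")

for item in channel.iter("item"):
    enclosure = item.find("enclosure")  # RSS 2.0 media attachment
    if enclosure is not None:
        media_url = enclosure.get("url")
        filename = media_url.rsplit("/", 1)[-1]
        urllib.request.urlretrieve(media_url, filename)  # save episode locally
        print("Downloaded", filename)
```

A subscriber’s podcast client simply polls such a feed on a schedule; any new item entries trigger new downloads, which is what makes the distribution automatic.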

M-Learning (Thesis Volume 1.2: Background of the Study)

Learning about science is a lifelong process; science is a constantly changing subject.  It is one of the few subjects that most people find interesting in one aspect or another; whether you are looking to research the latest diet fads, visit a local zoo or museum, or buy the newest technology on the market, science is functional in everyone’s life.  Individuals rarely engage in science learning in order to become experts in a field of science, or to achieve a certain level of generalized scientific knowledge and skills.  Despite these findings, the American public has consistently demonstrated a deep and abiding interest in personally relevant science and technology topics, suggesting that many people find science and technology interesting enough to pursue in their free time and that they engage in many kinds of activities that hold the potential for learning about science (Falk, Storksdieck, and Dierking).  In fact, the National Science Board conducted a survey in 2004 in which only 14% of those surveyed had not been to a museum, zoo, or library in the previous year (McComas, 2007).  The hands-on, or psychomotor, element of learning is a central design element in many informal settings.  At these sites one finds buttons to push, levers to pull, experiments to try, and paths to walk.  There are unusual animals, puzzling questions, interesting equipment, and endless ways to put these objects and experiences together to discover new and meaningful insights.  The opportunities for these tactile experiences to affect the affective domain and result in new learning are vast (McComas, 2007).

The biggest challenge for informal learning centers is to create lesson plans that produce efficient and measurable results.  If a designer builds particular learning goals into an exhibit and the museum visitors fail to grasp the intended lesson, it may be reasonable to suggest that the museum has failed.  If students visiting a science center return to school with new misconceptions, or without having tied the experience to the school curriculum, the experience may be deemed unsuccessful.  The learner in these situations does not fail, but designers, tour guides, and teachers may, if they do not take responsibility for considering the learning and facilitating knowledge construction (McComas, 2007).