Home Depot: The Customer Experience

This is the first blog of a new series called “Who Needs Training?”.  This series is designed to provide the consumer’s perspective on the need for staff and management training, based on personal experiences with businesses. We have all encountered both excellent and poor customer service; those experiences are critical to the success of any business. By examining these experiences through the lens of Human Performance Technology, we can identify those who need training and professional development, appreciate others who have been trained effectively, and apply those behaviors and practices in the workplace. This is by no means a complete Human Performance Technology analysis, but rather feedback from a customer’s viewpoint.

My wife has an amazing green thumb and enjoys caring for her small garden.  Unfortunately, so do the bugs. Her solution was to purchase more bugs – specifically, ladybugs, which are supposed to protect her tomato plants. So today, October 15, 2012, at approximately 4:30 p.m., I accompanied my wife to Home Depot in search of these protective ladybugs.  I know nothing about gardening or bugs, so the practical thing to do was to seek a little more information in the Garden Section of Home Depot. To my surprise, the customer service experience was terrible. Because of the time of day, the store was fairly empty, so finding employee assistance was very easy; however, their helpfulness ended with their accessibility.  As we approached the cashier in the Garden Section, it seemed as if we startled him, as it appeared that he was either hiding or avoiding work.  When asked if the store sold ladybugs, his only response was that he does not normally work in the Garden Section and we should find someone else to ask!

I am a practical individual and I understand that it’s unrealistic to expect staff members to know where every item in the store is located; however, in basic customer service training, all staff should have learned and developed the “soft skills” necessary to assist customers with their needs. Calling for additional help, taking a moment to reference a directory of product inventory, or responding with a simple phrase such as “I am not sure, let me find out for you” is the difference between good and poor customer service.

My initial response was that this particular staff member was lazy or simply a poor employee; however, the second staff member we approached in the Garden Section gave a similar response and again offered no additional assistance.  He indicated that he was new to the Garden Section and thought that the ladybugs were a seasonal item, but was uncertain. Based on this second encounter, and the inability to find any additional staff members trained to assist us in the Garden Section, we left the store with a negative experience rather than our ladybugs.

A classic model used in Human Performance Technology is included below and will be used to analyze the customer experience throughout this blog series.

The Person & The Environment
Skills and Knowledge & Data or Information
Capacity & Tools and Setting
Motives & Incentives

When reviewing the customer experience at Home Depot, we will first address “the person”. Both staff members we interacted with lacked any knowledge about the Garden Section. Regarding capacity, both individuals appeared capable of developing the skills required to complete the tasks they were presented with; however, we can only assume that they have not been trained to develop such skills and knowledge and/or lacked the motivation to provide simple customer service.

From an environmental aspect, neither staff member consulted any reference or resource to answer our question, which leads me to believe that supportive material is not provided to staff. This reflects poor management: failing to provide an environment in which staff can succeed. Given the similar responses we received from multiple staff members, I can conclude that they have not been trained in customer service skills, setting these employees up for failure by not providing the tools necessary to satisfy each customer’s needs. If one argues that these staff members are trained and equipped with everything they need to succeed, then there appears to be a lack of incentives for employees, as both individuals lacked any desire to go above and beyond to answer a relatively simple question about product inventory.

To consistently quantify and simplify the customer service experience from a Human Performance Technology approach throughout this blog series, we will refer back to the HPT model and give an “X” for each failed section. Since the model has three sections, the worst score a business can receive is “XXX”, while the best score is no “X”s. In my opinion, the Home Depot experience scores as follows:

Home Depot Experience (10/15/12)

    The Person & The Environment
X Skills and Knowledge & Data or Information
X Capacity & Tools and Setting
X Motives & Incentives

Human Performance Technology Grade: XXX
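For readers who like to see the scoring scheme spelled out mechanically, here is a small sketch of the grading rule described above: one “X” per failed row of the three-row HPT model, worst case “XXX”. The function and row names are illustrative inventions for this blog, not part of any published HPT tooling.

```python
# Sketch of the blog's "X"-per-section HPT grading scheme (names are
# hypothetical, chosen to mirror the model's three paired rows).

HPT_ROWS = [
    "Skills and Knowledge & Data or Information",
    "Capacity & Tools and Setting",
    "Motives & Incentives",
]

def hpt_grade(failed_rows):
    """Return one 'X' per failed row; worst grade is 'XXX', best is no X's."""
    marks = [row for row in HPT_ROWS if row in failed_rows]
    return "X" * len(marks) if marks else "no X's"

# The Home Depot visit failed all three rows:
print(hpt_grade(set(HPT_ROWS)))  # XXX
```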

If you are interested in learning how to maximize Human Performance Technology for your business, please contact Higher Power Training.

M-Learning (Thesis Volume 5.3: Implications for Further Research)

As the Ocean Institute Podcast Program continues to grow, future research will likely focus on implementing such a program in other subject matters taught at the Ocean Institute and other informal education centers.  The next step will be to address the Social Science and History programs offered at the Ocean Institute.  Once the pilot program is completed and all assessments have been analyzed, it is expected that development of a Social Science program may begin by next fall.  Other potential directions include partnering with the Orange County Performing Arts Center, Scripps Institution of Oceanography, various art museums, other natural history museums, and San Diego high schools to disseminate the program and assist in developing similar programs statewide.

By Higher Power Training

M-Learning (Thesis Volume 4.2: Project Evaluation – Data Presentation)

The program was divided into three components: professional development, the field trip, and podcast production.  As stated earlier, the professional development component is tangential to the main focus of this report and will not be addressed.  Although the evaluation process is in its early stages, the assessment tools have produced some beneficial results.  Thus far, 100 students from three different high schools and one middle school have participated in the Ocean Institute Program.  Most students have not completed their podcasts at this stage; however, a handful of accelerated students have successfully completed their podcasts and submitted the post-test and survey.

The first evaluation was to establish students’ knowledge of the topics addressed on the field trip.  The pre-tests submitted reflected a general knowledge of these topics: the average grade was 72%, 5% of the students scored 100%, and 54% scored 50% or less.  The post-tests, although not yet completed by all students, have shown significant improvement in academic knowledge; the average increased 24%, while the lowest score was 69%.  Refer to Table 1 to compare the results of the pre-tests and post-tests.
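The pre-test/post-test comparison above boils down to simple descriptive statistics. The sketch below illustrates the calculation with made-up scores; the actual data are in Table 1, and the numbers here are hypothetical placeholders, not the report's results.

```python
# Illustrative sketch of the pre-test/post-test comparison described above.
# The score lists are invented for demonstration only.

def average(scores):
    """Mean of a list of percentage scores."""
    return sum(scores) / len(scores)

pre_scores = [50, 45, 72, 80, 100, 85]   # hypothetical pre-test percentages
post_scores = [69, 70, 90, 95, 100, 98]  # hypothetical post-test percentages

# Change in the class average from pre-test to post-test
improvement = average(post_scores) - average(pre_scores)

# Share of pre-test scores at 50% or below
share_at_or_below_50 = sum(s <= 50 for s in pre_scores) / len(pre_scores)

print(f"Average improvement: {improvement:.1f} points")
print(f"Pre-test scores at 50% or less: {share_at_or_below_50:.0%}")
```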

The evaluation process continued by analyzing the student surveys submitted; at this point, only 22% of the surveys have been completed.  From the surveys it was determined that 80% of the students had never been to the Ocean Institute, and two students had never seen the ocean.  The surveys revealed a number of positive remarks:

  1. “I loved the hands-on activities and all the fish.”
  2. “It was nice to learn outside of the classroom.”
  3. “It was cool to take so many pictures and video of the field trip.”
  4. “It was the first time I have seen the ocean.”
  5. “I wish the field trip was longer.”
  6. “Making a podcast for school was meaningful.”
  7. “I make tons of videos, doing one for school was cool.”
  8. “I hope my podcast gets the most downloads.”
  9. “I learned a lot about podcasting and marine science.”
  10. “I downloaded all of my friends’ podcasts to my iPod.”

The evaluation process also included surveys completed by Ocean Institute visitors who used the podcasts while visiting; 90% of these surveys were completed by Ocean Institute members.  The surveys revealed that the average visitor listened to 7 podcasts, with visitors ranging from 1 to 13 podcasts.  85% of the visitors felt the podcasts enhanced the exhibits and made the trip to the Ocean Institute more educational and memorable.  Some of the valuable comments received included:

  1. “I have always enjoyed how the Ocean Institute showcases students’ work; these podcasts were another example of the impact OI has on children.”
  2. “I have been to the Ocean Institute numerous times, however, I learned more on this visit than all the other visits combined.”
  3. “I have used podcasts at other museums, and although they are more professionally produced, I felt having students producing podcasts from their own experiences was very special.”
  4. “Some of the podcasts were outstanding; did not listen to the longer podcasts.”
  5. “These students should be proud of what they have developed.”
  6. “I had never been to the Ocean Institute; the podcasts really helped me understand what happens at the facility.”

Although an initial evaluation has been completed through the student tests and surveys and the Ocean Institute visitor surveys, there has been no formal feedback at this point from the teacher surveys.  From what has been observed, many of the uncertainties teachers had were addressed during professional development; there has also been constant support provided during the podcast production process for teachers who felt they needed it.

M-Learning (Thesis Volume 4.1: Project Evaluation)

Because the program was unique, with few models to follow, the most difficult aspect was determining how to assess its success.  This assessment was critical to attracting participating schools, securing grant funding, and measuring the overall success of the program’s development.  The foundation of the evaluation was based on measuring the proposed objectives, which were divided into two categories: program objectives and student objectives.  The program objectives included:

  1. Develop a working partnership between Ocean Institute, Orange County high schools, Apple, and Best Buy.
  2. Enhance classroom curriculum by bringing subject matter experts into the classroom via podcasts.
  3. Integrate technology into the established curriculum for students to develop podcasts based on lesson plans.
  4. Design an instructional and motivational platform for students to showcase their conceptual understanding of various subject matters.
  5. Provide professional development for teachers on the advantages of using podcasting and other technology in their classrooms.

The student objectives included:

  1. Research books and field guides to collect the necessary information.
  2. Organize the information they have collected in a way that can be developed into an educational lecture.
  3. Educate their peers on skates and rays.
  4. Recognize the differences within aquaria.
  5. Use field guides to identify various species.
  6. Complete population counts and assess biodiversity.
  7. Identify keystone species and the importance of the species.
  8. Relate pollution to the biological integrity of the aquaria.
  9. Communicate their findings to their peers.
  10. Efficiently research on the internet to collect required data.
  11. Participate in an open discussion regarding problems and solutions to estuaries.
  12. Collect and organize digital video footage and images depicting their work.
  13. Edit all digital data.
  14. Create an mp4 file with digital data.
  15. Write an instructional script to teach a lesson plan.
  16. Record a voice file and integrate it into mp4 file to match with visual content.
  17. Use their creativity to design and produce an instructional podcast to be used by their peers at the Ocean Institute exhibits.

Measuring many of the student objectives was difficult due to the lack of quantitative results.  The student objectives were measured by monitoring milestone accomplishments and assessing the final podcast productions; to create a unified measuring tool, an assessment guide was created for teachers to evaluate the students’ progress.  Another tool designed to assess the student objectives was a comparison analysis between student pre-test and post-test results to measure each student’s academic growth.  To continue to improve the motivational and content components, students were also given surveys to complete; the surveys returned so far have been analyzed, and once a greater number are returned, revisions to the program may be deemed necessary.  The final analysis will compare state standardized testing results of students who participated in the program with those who did not; this is a critical assessment in terms of receiving future funding, but due to time constraints, it has not been completed.

To successfully evaluate the first two objectives, they will need to be redefined with quantitative elements; the original program objectives were broad because the program was experimental, with no model to follow.  As the pilot program comes to an end, the results of the teacher surveys and visitor surveys will help provide a foundation for redefining the objectives.  Objectives that require redefining include grant funding needs, which can be better determined once an accurate assessment of cost per class has been completed.  Another objective that requires redefining is the desired number of participating schools, both short term and long term; a number of components will influence this objective, including the Ocean Institute’s ability to accommodate schools throughout the program, professional development needs, and budget needs for both the schools and the Ocean Institute.  Finally, the professional development seminars contain their own set of objectives, but these will not be addressed within this report to prevent tangents.

It should be noted that each of the evaluations noted above will be an ongoing process to ensure the program is of the highest quality.  Continuing assessments are also critical because of frequent updates and changes in technology, which will need to be addressed as they are identified; the hope is that the surveys will surface such issues.

M-Learning (Thesis Volume 3.2: Ethical Considerations)

A project of this scope has provided and will continue to provide students of all demographics the opportunity to learn with the newest technology.  In this era of technology, the desire to integrate technology into learning curricula is at an all-time high.  Both private and government foundations are eager to financially support programs that provide students with the opportunity to use and learn with technology.  Given the diversity of students in Southern California, the program provided enough flexibility for students to express their creativity by choosing to create their podcast in Spanish if they desired.

M-Learning (Thesis Volume 3.1: Project Design – Learning Theory)

The Ocean Institute Podcast Program was designed with a combination of learning theories (behaviorism, constructivism, and multiple intelligences) as its foundation.  Having students produce their own podcasts is a valuable constructivist approach to learning.  Producing the podcasts in groups was based on the theory of multiple intelligences, as students work together and draw on their strengths to produce a high-quality podcast.  The reward students received by taking ownership of their project was an element of behaviorism.

The OIPP consisted of five elements designed to make learning motivational and the produced podcasts useful: professional development for the teachers, the student field trip, the student research, the podcast production process, and the pedagogical design.  Professional development for the teachers is the most critical element for limiting performance gaps within the program, since the day-to-day conduct of the program depended on each individual teacher; it was essential that the teachers were prepared properly to facilitate the program.  Due to time constraints and available resources, the professional development training will not be addressed in detail.  The field trip element was one of two essential motivators for the students.  The field trip design was modeled after pre-existing programs provided by the Ocean Institute.  The modifications were based on the concept that students must take what they learned on the field trip and apply it in a program monitored at school; this instructional design, incorporating both classroom time and the field trip, is a unique concept in informal education but perhaps essential at a time when schools must function under extremely strict budgets and activities like field trips are being eliminated by school districts.  The podcast production element was the other motivational element, allowing students to use the latest technology in their learning process.  School-aged children spend hours on websites like MySpace and YouTube, producing videos with their cell phones and uploading videos to their iPods; it only makes sense to provide students with a structured process to develop meaningful podcasts that they can be proud of and take ownership of.  The fourth element was the research process conducted by students; this is the element that fits within the State Standards and is the priority when justifying the work in class.
The pedagogical element, the most important element of instructional design, provided the overall structure and process of the program.  With the aid of the online content and resources, this program combined the flexibility and creativity within each class with the structure and guidance necessary to produce the expected results and achieve the program and student objectives.

M-Learning (Thesis Volume 2.8: Review of the Literature)

Student-produced multimedia provides students with the opportunity to learn about technology, express themselves through creativity, and showcase their work to a larger audience.  Student-produced multimedia provides teachers with an efficient way to integrate technology into an established curriculum, design inquiry-based lesson plans, and assess student learning without having to use traditional tests.  One type of student-produced multimedia is digital storytelling, which incorporates images, audio, video, text, and image effects.  When creating a digital story, students develop the skills necessary to research, write, design, produce, and educate (Chung, 2007).  Digital storytelling integrates the arts, education, local communities, technology, and storytelling.  According to Chung (2007), students develop and apply multi-literacy skills, aesthetic sensitivities, and critical faculties to address greater issues of importance to a larger audience.  Digital storytelling is applicable to all school subjects, but as Chung (2007) points out, many schools in America have ample funds for maintaining a computer lab while funds for art supplies are either minimal or non-existent.  The implementation of digital storytelling offers art educators another avenue to implement an innovative and relevant art program for the technology-savvy digital generation (Chung, 2007).

One sample of podcasting in elementary schools comes from Jamestown Elementary: to align the podcasts with the curriculum, the teachers created handouts to help students produce their individual segments about a historical person or event from the Jamestown settlement.  The students could create their segments in different ways – as ‘an interview, a report, a poem, a word play, a skit, a Did You Know segment, or any other creative way of communicating what you know and have learned’ (Long, 2007).  Producing podcasts can help students identify their strengths and showcase their talents while working together in groups to produce a product that can be viewed worldwide.  By producing podcasts in groups, the creative writers record poetry, stories, or skits; the artists provide drawings or photography; musicians produce songs; and the technicians piece it all together (Long, 2007).

Digital storytelling is an example of a constructivist approach, which puts interactive technologies in the hands of student producers.  According to Brown (2007), when students are given creative freedom to construct with multimedia tools in an activity that is personally meaningful, they exhibit high levels of motivation and task engagement, develop skills through directed and needs-driven episodes, and exhibit higher-order thinking; individual differences are valued, accentuated, and expressed through interface design.  One approach to designing student-produced multimedia for web-based classrooms is competency-based learning (CBL), a self-directed, individualized mastery learning method that allows students to achieve predetermined competency standards by mastering the knowledge and skills they have learned (Chang, 2007).  According to Chang, since web learning has recently gained much attention in college, CBL on the Web has a certain level of demand and feasibility.