Behavioral history and pigeons' "guiding cues" performance.
Traditionally, each response in a response sequence (i.e., a behavior chain) is thought to be jointly controlled: A discriminative stimulus sets the occasion for the response. A stimulus change reinforces the response and sets the occasion for the subsequent response in the sequence. Correctly executing the entire response sequence is reinforced after the terminal response. For example, a simple response sequence in a rodent operant chamber might be: left light illuminated (discriminative stimulus) → left lever press (response) → left light extinguished/right light illuminated (discriminative stimulus) → right lever press (response) → right light extinguished/food delivered (reinforcer). Behavioral researchers have long been interested in how consequences for responding (i.e., reinforcement) affect behavior in response sequences (e.g., Reed et al. 1991). An important research question that has not been addressed until recently is how learned response sequences are affected by changes to discriminative stimuli that were present during acquisition of the response sequence.
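The two-response chain described above can be illustrated as a small finite-state machine. This is a minimal sketch, not the authors' procedure code; the state names and the `step` function are assumptions introduced here for illustration.

```python
# Illustrative sketch of the two-lever behavior chain: each discriminative
# stimulus sets the occasion for one response, and the stimulus change
# produced by that response sets the occasion for the next.

def step(state, response):
    """Advance the chain one response: returns (next_state, outcome)."""
    if state == "left_light_on":
        if response == "left_press":
            # Left press extinguishes the left light and illuminates the
            # right light, which occasions the next response.
            return "right_light_on", None
    elif state == "right_light_on":
        if response == "right_press":
            # Completing the sequence produces the terminal reinforcer.
            return "start", "food"
    # Any other response leaves the chain unchanged (simplification).
    return state, None

state = "left_light_on"
state, outcome = step(state, "left_press")   # left light off, right light on
state, outcome = step(state, "right_press")  # sequence complete
print(state, outcome)  # start food
```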
Researchers have begun to investigate how changing or removing the discriminative stimuli that "guide" the response sequence during acquisition affects skill maintenance (e.g., Reid et al. 2010). Changes in discriminative stimuli can be highly disruptive--even when new discriminative stimuli are highly predictive of reinforcement. In fact, changes to equally predictive discriminative stimuli can be more disruptive than removing discriminative stimuli altogether (Reid et al. 2013). If particular changes in the way consequences are signaled reduce the frequency with which the reinforced response sequence occurs, specifying those changes would help behavioral scientists develop methods for maintaining response-sequence accuracy at high levels in dynamic, unpredictable circumstances when discriminative stimuli may be changed or removed. Researchers might also begin to develop better theoretical accounts for why such disruptions occur and, most importantly, why some stimuli that are highly predictive of reinforcement do not control behavior. A better understanding of how changes to discriminative stimuli affect learning has far-reaching implications for teaching and maintaining motor skills in humans. The objectives of the two experiments that follow were to a) further test a theoretical explanation that has been used to explain the disruptiveness of reversing discriminative stimuli associated with an acquired response sequence (Experiment 1), and b) reduce the extent to which the reversal in discriminative stimuli disrupts response-sequence accuracy by manipulating behavioral histories with the discriminative stimuli in question (Experiment 2).
Previous response-sequence learning studies have focused on how reinforcement can organize behavior and create response units (e.g., Bacha-Mendez et al. 2007; Reed et al. 1991; Schneider and Davison 2005; Schneider and Morris 1992). Other research has investigated how choice and timing behavior adjust in dynamic environments when discriminative stimuli do not change, but the contingencies of reinforcement do (e.g., Kyonka and Grace 2007, 2008). In all of these general circumstances, behavior changes in ways that are highly adaptive for maximizing rate of reinforcement.
Researchers are becoming more interested in how changes in discriminative stimuli affect behavior when the reinforced response sequence remains unchanged. In several experiments, Reid and colleagues have evaluated how response sequences develop and are affected by changes in discriminative stimuli. For example, Reid et al. (2001) found that if a tone was presented at the beginning and end of a response sequence, acquisition was faster than when the same response sequence was learned without the tones.
Reid et al. (2010) exposed a group of rats to a "Follow-the-Light" task. Under this procedure, left-right (L-R) lever-press response sequences were reinforced with food. The illumination of lights above each lever served as discriminative stimuli for each response. At the start of each trial the light above the left lever was lighted. The programmed consequence of a press to that lever was that the light was extinguished and the light above the right lever was lighted. If the rat then pressed the right lever, all lights were extinguished and food was delivered. Once all rats produced the L-R sequence on at least 80 % of trials (i.e., performed at above 80 % accuracy), they were split into two groups. The lights were eliminated for the "No-Lights" group. For the "Reversed-Lights" group, the lights were presented above the non-active lever in the sequence--that is, in right-left order.
After five sessions in the No-Lights condition, response-sequence accuracy in the No-Lights group was approximately 60 %--lower than at the end of the Follow-the-Light condition, but still well above chance (25 %). However, after five sessions in the Reversed-Lights condition, response-sequence accuracy in the Reversed-Lights group was 5 % across subjects. Most of the response sequences during the Reversed-Lights condition were R-L, indicating responding was controlled by the discriminative properties of the lights. The authors concluded there was "stronger stimulus control by the 'misleading' lights and weaker control by the history of the subjects' own behavior" (pp. 514-515). In other words, the authors suggested two potential sources of behavioral control: environmental cues (panel lights) and practice cues, which are the result of repeating the same sequence many times (Lanai 1975; Shimp 1981, 1982). They explained the results in terms of an overshadowing effect--the panel lights overshadowed control by practice cues when the lights were reversed.
Two theoretical accounts have been considered as explanations for why switching from 'Follow-the-Lights' to 'Reversed-Lights' is particularly disruptive of response-sequence accuracy. The spatial S-R compatibility effect (Fitts and Seeger 1953; Hommel 1995, 2011; Kiernan et al. 2012; Proctor and Reeve 1990) holds that reaction times should be faster when the discriminative stimulus and the correct response are spatially compatible, or adjacent, because a reflex or a "spatial code" orients behavior toward the source of stimulation. The Simon Effect is a more complex version of S-R compatibility in which stimulus position is irrelevant (Simon 1969; Urcuioli et al. 2005). Applied to the two-lever response sequence in Reid et al.'s (2010, 2013) experiments, both versions of this theory predict response sequences will be more accurate and faster when the illuminated light is above the lever on which a response was required than when it is above the lever to be avoided. Therefore, it can adequately account for disruptions in response-sequence accuracy during transitions from Follow-the-Lights to Reversed-Lights.
The other theoretical explanation that has been considered is feature-positive discrimination bias (Hearst 1991; Jenkins and Sainsbury 1969, 1970; Lotz et al. 2012; Nallan et al. 1984; Sainsbury 1973). It is a tendency for organisms to discriminate the presence or addition of a stimulus more readily than the absence or removal of a stimulus. A feature-positive arrangement is one in which the presence of a stimulus signals reinforcement and a feature-negative arrangement is one in which the absence of a stimulus signals reinforcement. In Reid et al.'s (2010, 2013) experiments, the Follow-the-Lights condition could be considered a feature-positive condition, while the Reversed-Lights condition could be considered a feature-negative condition. The theory follows that the presence of the feature-positive lights will prevent control by the other discriminative stimuli that are present, such as practice cues from repeating the same L-R sequence. The theory predicts that responses will be biased toward the feature-positive lights and can explain why response-sequence accuracy was below chance in Reid et al.'s Reversed-Lights conditions.
Reid et al. (2013) considered the major strengths and limitations of each of these potential explanations. They concluded that both explanations could account for most of Reid et al.'s (2010, 2013) simple binary transitions from one condition to another, but neither could account for other aspects of their data, especially in ABA procedures. One limitation of the S-R compatibility explanation is that it cannot account, without the addition of post-hoc assumptions, for Reid et al.'s (2013, Experiment 3) contrary finding that illuminating both lights produced slower responding and fewer completed trials (approximately 30 % fewer trials per session) than when neither light was illuminated. Spatial S-R compatibility explicitly predicts faster responding when the spatial locations of the stimulus and response are in agreement, as they were when both lights were illuminated.
As mentioned, the feature-positive discrimination bias can explain some of the previous response-sequence rat data (Reid et al. 2010, 2013). By using pigeons as subjects and illuminated keys as stimuli in Experiment 1, the feature-negative conditions were eliminated and the adequacy of the feature-positive bias explanation was further tested. If response-sequence accuracy is similarly disrupted in an all-feature positive reversed-cues condition as it was in Reid et al.'s Reversed-Lights condition, then the feature-positive discrimination bias theory cannot account for the disruption.
In Experiment 1, pigeons were exposed to a left-peck, right-peck response sequence contingency and effects of changes in discriminative stimuli were assessed. Specifically, the experiment was designed to assess how experience with stimuli formerly correlated with extinction (S- stimuli) might affect behavior in the presence and absence of stimuli formerly correlated with reinforcement (S+ stimuli) when the underlying response sequence (left peck-right peck) necessary for reinforcement remained unchanged. In Experiment 1, all S+ and S- stimuli in all conditions were positive features: lighted response keys of differing colors.
Subjects This research was approved by West Virginia University's Animal Care and Use Committee. Four pigeons numbered 301 to 304 maintained at 85 % of their free-feeding body weight via appropriate post-session feedings served as subjects for Experiment 1. Pigeons were housed individually in a vivarium with a 12-hr light:dark cycle with continuous free access to water and intermittent access to grit. All pigeons had previous experience pecking keys for access to food in a variety of schedules of reinforcement.
Apparatus Four standard operant chambers (25.5 cm deep × 32 cm wide × 33.5 cm high) enclosed in sound-attenuating boxes equipped with ventilation fans were used. Each chamber contained three response keys arranged 6 cm apart and 24 cm above the floor of the chamber. Response keys could be lighted red, green, or white. A grain hopper (5.5 cm high × 6 cm wide) was located below the middle response key and 5.5 cm from the floor. A house light was located at the top of the chamber on the wall opposite the response keys. The grain hopper aperture was lighted during reinforcer presentation and the hopper contained mixed grain. A force of approximately 0.15 N was required to register a response on any key. All experimental events were controlled through a computer and MED-PC interface located in an adjacent room.
Procedure Because all pigeons had previous experience pecking lighted keys for food, training began immediately. All sessions lasted for 72 trials or 60 min, whichever came first. All four pigeons were first trained on a left key (L)-right key (R) response sequence, in which red lights served as S+ stimuli for two sessions. Pigeon 303 received eight additional sessions of this training because of a failure to acquire the L-R sequence at above 80 % accuracy. During the training period, only the red lights were present on the active key in the L-R sequence. A peck to the red left key extinguished it and lighted the right key red. If the next peck was to the lighted right red key, it resulted in access to food. However, if the next peck was to the left key, it resulted in a 10-s blackout followed by presentation of a new trial.
After training, all four pigeons were exposed to the Guiding-Cues condition. Figure 1 is an event-sequence schematic for the Guiding-Cues, Reversed-Cues, and No-Cues conditions described below. At the start of a trial, the house light was lighted, the left key was lighted red and the right key was lighted white. If the first peck was to the right key, the color of the right key was changed from white to red and the color of the left key from red to white. The next peck on either key resulted in a 10-s blackout during which all lights were extinguished, followed by the start of the next trial. If the first peck in a trial was on the left key, the color of the left key was changed from red to white and the right key from white to red. A second peck on the left key resulted in a 10-s blackout during which all lights were extinguished, followed by the start of the next trial. A peck on the right key following a peck on the left key resulted in 3-s access to grain, during which all the lights in the chamber were extinguished, and the grain aperture was lighted. The subsequent trial followed immediately. There were four possible two-peck response sequences:
[FIGURE 1 OMITTED]
L-R, R-L, L-L, R-R. Only the L-R sequence was reinforced with access to food.
Mastery of the L-R sequence in the presence of the guiding cues was defined as at least 80 % accuracy in each session for five consecutive sessions, with no increasing or decreasing trends in accuracy. Once mastery was reached in the Guiding-Cues condition, Pigeons 301 and 303 experienced the No-Cues condition for five sessions, and Pigeons 302 and 304 experienced the Reversed-Cues condition for five sessions. In the No-Cues condition, the cues provided by the red lights were removed; both lights were lighted white throughout a trial. The L-R response requirement remained. In the Reversed-Cues condition, the arrangement of red and white lights was reversed but the response requirement remained the same. At the start of a trial, the left key was lighted white and the right key red. After the first peck to either key, the color of the right key was changed from red to white and the color of the left key from white to red. For the No-Cues and Reversed-Cues conditions, the L-R peck sequence was reinforced with access to food and immediate presentation of the next trial. All other two-peck sequences produced a 10-s blackout. The only programmed difference was that in No-Cues, there were no discriminative stimuli to "follow" and in Reversed-Cues, pecks were required to "follow" the white key light instead of following the red key light to obtain reinforcement. Table 1 shows the order of conditions and number of sessions per condition for each pigeon.
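The trial contingencies for the three conditions can be sketched in code. This is a hypothetical illustration of the logic described above, not the experiment's control program; the function names and representation of key colors are assumptions.

```python
# Sketch of the trial contingencies: only the L-R two-peck sequence is
# reinforced; key colors depend on condition and on whether the first
# peck of the trial has already occurred.

REINFORCED = ("L", "R")  # only the left-then-right sequence produces food

def trial_outcome(seq):
    """Return 'food' for the L-R sequence, else 'blackout' (10 s)."""
    return "food" if tuple(seq) == REINFORCED else "blackout"

def key_colors(condition, pecks_so_far):
    """(left, right) key colors at the current point in a trial."""
    first_peck_made = len(pecks_so_far) >= 1
    if condition == "Guiding-Cues":
        # Red starts on the left key and moves to the right after any peck.
        return ("white", "red") if first_peck_made else ("red", "white")
    if condition == "Reversed-Cues":
        # The red/white arrangement is reversed; the contingency is not.
        return ("red", "white") if first_peck_made else ("white", "red")
    return ("white", "white")  # No-Cues: both keys white throughout

# Chance accuracy: 1 of the 4 possible two-peck sequences is reinforced.
sequences = [("L", "L"), ("L", "R"), ("R", "L"), ("R", "R")]
chance = sum(trial_outcome(s) == "food" for s in sequences) / len(sequences)
print(chance)  # 0.25
```

The 25 % chance level used throughout the results follows directly from the four equiprobable two-peck sequences.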
Table 1 Condition orders for pigeons in Experiment 1, with number of sessions per condition in parentheses

Pigeon 301: Red Only (2), Guiding Cues (5), No Cues (5)
Pigeon 302: Red Only (2), Guiding Cues (14), Reversed Cues (5)
Pigeon 303: Red Only (6), Guiding Cues (2), Red Only (4), Guiding Cues (5), No Cues (5)
Pigeon 304: Red Only (2), Guiding Cues (5), Reversed Cues (5)
Results Figure 2 shows the percentage of L-R response sequences and R-L response sequences during the last five sessions of the Guiding-Cues condition and all five sessions of the second condition for each pigeon. All pigeons acquired the L-R response sequence at above 80 % accuracy in the presence of the guiding cues. Accuracy for Pigeons 301 and 304 met the mastery criteria in the minimum five sessions. Accuracy for Pigeons 302 and 303 met the mastery criteria in 14 and seven sessions, respectively.
[FIGURE 2 OMITTED]
In the subsequent condition, Pigeons 301 and 303 were exposed to the No-Cues condition. Both keys were lighted white and the L-R response sequence contingency remained in effect. In the first session of the No-Cues condition, the proportion of accurate L-R response sequences dropped from above 80 % to 33 % and 3 % for Pigeons 301 and 303, respectively. Accuracy increased over five sessions for both pigeons in the No-Cues condition. In the final session, accuracy was 64 % (46 correct trials) and 56 % (40 correct trials) for Pigeons 301 and 303, respectively. Binomial sign tests on number of L-R sequences with Bonferroni-adjusted alphas to correct for multiple tests were conducted to test whether 1) accuracy was significantly greater than chance, 25 %; and 2) accuracy was significantly less than mastery, 80 %. Sign tests revealed that in the fifth session of the No-Cues condition, accuracy was significantly greater than chance (ps<0.001) for both pigeons, and not significantly less than mastery (ps> 0.99) for either. Greater-than-chance accuracy by the end of the No-Cues condition suggests that pigeons learned other discriminative stimuli (e.g., location) related to the response sequence in addition to the red-light cues during the Guiding-Cues condition.
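The sign-test logic can be reproduced with an exact binomial tail, using only the standard library. This is a sketch of the kind of test described above, not the authors' analysis script; the session values are taken from the text (Pigeon 301, final No-Cues session: 46 correct of 72 trials), and the exact tail computation is an assumption about how the tests were run.

```python
from math import comb

def binom_tail_geq(k, n, p):
    """Exact upper tail: P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trials, n_correct = 72, 46   # Pigeon 301, fifth No-Cues session
chance, mastery = 0.25, 0.80

# 1) Is accuracy greater than chance (25 %)? Upper tail at chance.
p_chance = binom_tail_geq(n_correct, n_trials, chance)

# 2) Is accuracy less than mastery (80 %)? Lower tail at mastery.
p_mastery = 1 - binom_tail_geq(n_correct + 1, n_trials, mastery)

# Bonferroni correction for the two tests conducted per session.
alpha = 0.05 / 2
print(p_chance < alpha)  # True: 46/72 is far above the chance level
```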
Neither pigeon in the Reversed-Cues condition produced any correct L-R sequences in the presence of the reversed guiding cues. Pigeon 302 continued to complete all 72 trials in the five Reversed-Cues sessions. Of those trials, 15.8 % were the R-L "follow red" sequence. Pigeon 304 finished 22, 14, 22, 16 and ten trials in the five consecutive Reversed Cues sessions, respectively--an indication that responding may have been undergoing extinction in the presence of the reversed cues. Of those completed trials, 17.5 % were the R-L sequence.
To compare changes in accuracy for pigeons in the two groups, the difference between the number of accurate response sequences on the last day of the Guiding-Cues condition and the number of accurate response sequences on the fifth day of the No-Cues or Reversed-Cues condition was computed for each pigeon. An independent-samples t-test revealed that accuracy decreased significantly more in the Reversed-Cues condition than the No-Cues condition, t(2)=12.73, p=0.006. The L-R sequence reemerged when discriminative stimuli were removed in the No-Cues condition, but not when they were reversed in the Reversed-Cues condition.
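The group comparison is a pooled-variance t-test with two pigeons per group (df = 2). The sketch below illustrates that computation with the standard library only; the per-pigeon decrease scores are hypothetical placeholders, since the article reports only the resulting statistic.

```python
from statistics import mean, variance
from math import sqrt

def pooled_t(group1, group2):
    """Independent-samples t with pooled variance; returns (t, df)."""
    n1, n2 = len(group1), len(group2)
    sp2 = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) \
        / (n1 + n2 - 2)
    t = (mean(group1) - mean(group2)) / sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Hypothetical accuracy decreases (correct trials lost per pigeon);
# the actual per-pigeon values are not reported in the text.
reversed_drop = [70, 68]
no_cues_drop = [26, 20]
t, df = pooled_t(reversed_drop, no_cues_drop)
print(df)  # 2, matching the reported t(2)
```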
Pigeons' performance was similar to rats' performance reported by Reid et al. (2010) and Reid et al. (2013). Performance in comparable conditions was similar across studies and species. Accuracy in the feature-positive No-Cues condition was comparable to accuracy in Reid et al.'s (2010) feature-negative No-Lights condition. In both studies, accuracy dropped from above 80 % in the previous condition to approximately 60 % when discriminative stimuli were removed for five sessions.
When the guiding cues were reversed in the Reversed-Cues condition, Reid et al.'s (2010) rats' accuracy fell to approximately 5 %. A feature-positive discrimination bias was one potential explanation for the large decrease in accuracy: a propensity to respond to feature-positive discriminative stimuli may have prevented control over responding by practice cues. When Pigeons 302 and 304 were switched from the Guiding-Cues condition to the Reversed-Cues condition in Experiment 1, the correct L-R response sequence did not change. The signaling functions of the red and white lights switched, but the correct sequence still followed a feature-positive cue. No correct sequences were recorded for either pigeon in the Reversed-Cues condition of Experiment 1. Even under what amounted to extinction conditions for Pigeon 304, the red-light stimuli still controlled behavior, or at least prevented control by the white lights and practice cues associated with the previously reinforced L-R sequence.
Results of Experiment 1 cannot be attributed to a feature-positive discrimination bias because all stimuli in all conditions were feature-positive. The primary objective in Experiment 2 was to determine whether, after initial training with discriminative stimuli, establishing a reinforcement history without discriminative stimuli would allow behavior to adapt to the reversal in discriminative stimuli in the Reversed-Cues condition. Specifically, we hypothesized that a history of access to food as a consequence of pecking white keys in a No-Cues condition, thus creating a history in which white keys signaled reinforcement, would facilitate behavior to come under control of the white key light in the Reversed-Cues condition--causing sequence accuracy to increase in the Reversed-Cues condition. Reid et al. (2013) showed that the extent to which a change in discriminative stimuli disrupted sequence accuracy depended in large part on whether or not their rats had been exposed to certain conditions prior to the transition. Experiment 2 further explored the possibility that the relative disruptiveness of a transition is in part controlled by an organism's history with the discriminative stimuli in question.
Four pigeons were exposed to a series of Guiding-Cues, No-Cues, and Reversed-Cues conditions and replications with the expectation that creating a history in which the white lights signaled reinforcement in the No-Cues condition would facilitate sequence-accuracy maintenance/recovery when discriminative stimuli were reversed. This signal-food history hypothesis yielded two empirical predictions for Experiment 2. First, as in Experiment 1, L-R response sequences would not occur with initial exposure to a Reversed-Cues condition. However, if a history of reinforcement for pecking the white key had been established in the interim, L-R response sequences would emerge in a replication of the Reversed-Cues condition. Second, L-R response sequences would be observed during initial exposure to a Reversed-Cues condition, provided a history of reinforcement for pecking the white key in the No-Cues condition.
Subjects Four pigeons numbered 205 to 208 maintained at 85 % of their free-feeding body weight via appropriate post-session feedings served as subjects for Experiment 2. Pigeons were housed individually in a vivarium with a 12-hr light:dark cycle with continuous free access to water and intermittent access to grit. All had a prior experimental history with fixed-interval (FI) and response-initiated fixed-interval (RIFI) schedules in the presence of both red and green key lights (Fox and Kyonka 2013).

Procedure All four pigeons were trained on an L-R response sequence during which red lights served as discriminative stimuli for two sessions. During this training period, only the red lights were present on the active key in the L-R sequence. A peck to the left red key extinguished it and lit the right red key. A subsequent peck to the right red key resulted in the delivery of reinforcement, and a subsequent peck to the left key resulted in a 10-s blackout.
After training, all four pigeons were exposed to the Guiding-Cues condition. The white lights (S-) were added for the Guiding-Cues condition. All Guiding-Cues, No-Cues and Reversed-Cues conditions were the same as described in Experiment 1 (see Fig. 1). After initial Guiding-Cues training, Pigeons 205 and 206 were exposed to conditions in the following order: Reversed-Cues, Guiding-Cues, No-Cues, Guiding-Cues, Reversed-Cues. After initial Guiding-Cues training, Pigeons 207 and 208 were exposed to conditions in the following order: No-Cues, Guiding-Cues, No-Cues, Guiding-Cues, Reversed-Cues. All Reversed-Cues and No-Cues conditions lasted five sessions. Guiding-Cues conditions lasted until accuracy was above 80 % for five consecutive sessions, and there were no increasing or decreasing trends in accuracy. As in Experiment 1, the L-R sequence was reinforced and all other sequences (R-L, R-R, and L-L) resulted in a 10-s blackout in all conditions, regardless of the discriminative stimuli present. Table 2 shows the order of conditions and number of sessions per condition for each pigeon.
Table 2 Condition orders for pigeons in Experiment 2, with number of sessions per condition in parentheses

Pigeon 205: Red Only (2), Guiding Cues (12), Reversed Cues (5), Guiding Cues (27), No Cues (5), Guiding Cues (37), Reversed Cues (5)
Pigeon 206: Red Only (2), Guiding Cues (6), Reversed Cues (5), Guiding Cues (5), No Cues (5), Guiding Cues (5), Reversed Cues (5)
Pigeon 207: Red Only (2), Guiding Cues (6), No Cues (5), Guiding Cues (5), No Cues (5), Guiding Cues (5), Reversed Cues (5)
Pigeon 208: Red Only (2), Guiding Cues (5), No Cues (5), Guiding Cues (5), No Cues (5), Guiding Cues (5), Reversed Cues (5)
Apparatus The apparatus was the same as in Experiment 1.

Results Figure 3 shows the percentage of correct L-R response sequences for Pigeons 205 and 206. Both met the mastery criterion of 80 % accuracy in all Guiding-Cues conditions. Pigeon 205 required 12, 27 and 37 sessions for mastery in the initial Guiding-Cues condition and each replication, respectively. Pigeon 206 initially met the mastery criterion in six sessions and then within the minimum five sessions for both replications. Consistent with the results of Experiment 1, accuracy fell to near 0 % (a single L-R sequence was recorded for Pigeon 205, in Session 15) in the first Reversed-Cues condition.

[FIGURE 3 OMITTED]

In the No-Cues condition that followed, accuracy was initially below chance for Pigeon 205 but increased in the final three sessions and was 32 % in the final session of the condition. Across the five sessions of the No-Cues condition (Sessions 17-21), accuracy ranged from 1 to 14 % for Pigeon 206. In addition, Pigeon 206 finished 72, 72, 68, 58 and 37 trials in Sessions 17 through 21, respectively--indicating that pecking was potentially undergoing extinction. In the first session of the second Reversed-Cues condition, accuracy was below chance for Pigeons 205 and 206. However, accuracy increased across the five sessions for Pigeon 205. In the fourth and fifth sessions of the Reversed-Cues condition, accuracy was 75 % and 74 %, respectively, suggesting that the history of reinforcement in the No-Cues condition was sufficient for high levels of response-sequence accuracy to emerge in the Reversed-Cues condition. Accuracy was 0 % across all five sessions in the final Reversed-Cues condition for Pigeon 206. The majority of response sequences recorded in the Reversed-Cues conditions for Pigeon 206 were R-L--an indication that the red lights were controlling responding.
For a quantitative assessment of accuracy, binomial sign tests on number of L-R sequences were conducted for the last session in both Reversed-Cues conditions and the No-Cues condition for each pigeon. Bonferroni-adjusted alphas were used to correct for multiple tests asking whether accuracy was 1) greater than chance and 2) less than mastery. Sign tests revealed that, for both pigeons, accuracy in the fifth session of the first Reversed-Cues condition and the fifth session of the No-Cues condition, was significantly less than mastery (ps< 0.001), but not significantly greater than chance (ps>0.11). Similarly, because no L-R sequences were recorded for Pigeon 206 in the second Reversed-Cues condition, accuracy was significantly less than mastery and not greater than chance. However, sign tests revealed that for Pigeon 205, accuracy in the fifth session of the second Reversed-Cues condition was significantly greater than chance (p<0.001) and not significantly less than mastery (p=0.12).
Figure 4 shows the percentage of correct L-R response sequences for Pigeons 207 and 208. The mastery criterion of 80 % accuracy in the initial Guiding-Cues condition required six sessions for Pigeon 207. Pigeon 208's accuracy was consistently below the 80 % criterion upon initial exposure to the first Guiding-Cues condition. The response sequences recorded for Pigeon 208 were predominantly L-L sequences with very brief inter-response times, likely due to key reverberation following pecks of greater than the minimum-required force. For this reason, a 0.01-s "buffer" was programmed for Pigeon 208. After any peck to any key, no pecks were recorded on that same key until 0.01 s had elapsed. After the buffer was added to the computer program, Pigeon 208 met the 80 % criterion in five sessions. In the No-Cues condition that followed, accuracy increased across the five sessions for Pigeon 207, but remained below 40 %. Accuracy in each No-Cues session was at or below 6 % for Pigeon 208. Upon re-presentation of the Guiding-Cues stimuli in the third condition, accuracy was above 80 % for both pigeons in the minimum five sessions. In the second No-Cues condition, accuracy showed no trend at a mean of 47.4 % for Pigeon 207. Overall, accuracy in the second No-Cues condition for Pigeon 208 was a mean of 9.6 %, lower than in Guiding-Cues, but higher than in the previous No-Cues. In the final session of the second No-Cues condition, accuracy was 22 % for Pigeon 208. Accuracy in the third Guiding-Cues condition was above 80 % for both pigeons for the minimum five sessions. In the sixth and final condition, which was the sole presentation of the Reversed-Cues condition for Pigeons 207 and 208, accuracy in the first session was 29 % for Pigeon 207 and 1 % for Pigeon 208. Accuracy for both pigeons increased across the five sessions.
By the fifth session, accuracy in the Reversed-Cues condition was comparable to accuracy observed in the Guiding-Cues conditions, finishing at 92 % for both Pigeons 207 and 208.
[FIGURE 4 OMITTED]
For a quantitative assessment of accuracy, binomial sign tests on number of L-R sequences with Bonferroni-adjusted alphas were conducted for the last session in both No-Cues conditions and the Reversed-Cues condition for Pigeons 207 and 208. Sign tests revealed that, for both pigeons, accuracy in the fifth session of both No-Cues conditions was significantly less than mastery (ps<0.001). Accuracy was also significantly greater than chance for Pigeon 207 in the second No-Cues condition (p=0.001). After a history of food reinforcement as a consequence of pecking the white key had been established in the No-Cues conditions, accuracy was significantly greater than chance (ps<0.001) and not significantly less than mastery (accuracy was in fact greater than mastery) for both pigeons in the Reversed-Cues condition. The improvement in accuracy indicates that the white lights may have acquired S+ properties for both pigeons during exposure to the No-Cues conditions.
The critical between-groups comparison for Experiment 2 was whether changes in accuracy were different upon initial exposure to the Reversed-Cues condition (condition 2 for Pigeons 205 and 206; condition 6 for Pigeons 207 and 208). The difference between the number of accurate response sequences on the last day of the preceding Guiding-Cues condition and the number of accurate response sequences on the fifth day of the Reversed-Cues condition was computed for each pigeon. An independent-samples t-test revealed that accuracy decreased significantly more for Pigeons 205 and 206 than it did for Pigeons 207 and 208, t(2)=11.57, p=0.007. Evidently, a history of reinforcement for the L-R sequence in the presence of white lights on both keys made it possible for the response sequence to come under the control of the white keys as discriminative stimuli in the Reversed-Cues condition, even though they were presented concurrently with previous S+ stimuli (the red lights).
All of the non-L-R response sequences in all Guiding-Cues conditions were L-L sequences. Although pecking the key with enough force to register a response produced auditory feedback and (in Guiding-Cues and Reversed-Cues conditions) a visual stimulus change, it is possible that pigeons did not always discriminate that the first peck was registered. Interestingly, when the L-R sequence did not occur in Reversed-Cues conditions, no other sequence occurred reliably across pigeons and the R-L (follow-red) sequence only occurred reliably for one pigeon--Pigeon 206.
Accuracy in the final Reversed-Cues session of Experiment 2 was almost as high for Pigeon 205 as it was for Pigeons 207 and 208, which suggests that initial exposure to the No-Cues condition immediately prior to the Reversed-Cues condition was not necessary for recovery of the L-R sequence. However, it is not clear what stimuli were controlling behavior in the final Reversed-Cues condition for those three pigeons. Stimuli unrelated to the key lights, such as "practice cues," may have acquired control over behavior once the L-R sequence had been reinforced without discriminative stimuli: The red lights no longer overshadowed control by these other stimuli in the Reversed-Cues condition. It is also possible that the white lights in the Reversed-Cues condition may have taken on the same S+ function as the red lights in the Guiding-Cues condition after pecking white keys was reinforced in the No-Cues condition.
Once the L-R sequence had been reinforced in the absence of discriminative stimuli, the sequence was maintained or recovered independently of discriminative stimuli. Nevertheless, there was some evidence in Experiment 2 that key-light discriminative stimuli continued to exert stimulus control over response sequences. For three out of four pigeons, percent accuracy in the final session of the last Reversed-Cues condition was comparable to accuracy in Guiding-Cues conditions, but accuracy in No-Cues conditions was always lower: both the guiding cues and the reversed cues were effective discriminative stimuli. It is possible that after additional experience switching back and forth between the Guiding-Cues and Reversed-Cues conditions, accurate sequences might be maintained above 80 % and not initially decline following a transition. Reid et al. (2014) also observed maintained or improved accuracy during repeated shifts between conditions in pigeons completing L-R response sequences.
The additional experience in the No-Cues replication prior to any exposure to a Reversed-Cues condition apparently facilitated L-R sequences in the final Reversed-Cues condition for Pigeons 207 and 208. The fact that accuracy in the fifth session of the Reversed-Cues replication was near mastery for Pigeon 205 suggests, however, that the specific number and presentation order of conditions were not strictly necessary. Pigeon 205 experienced the No-Cues condition only once and finished at 74 % accuracy in the final Reversed-Cues session. Pigeons 207 and 208 experienced the No-Cues condition for five more sessions than Pigeons 205 and 206, and finished the final Reversed-Cues condition at 92 % accuracy. Additional experience for Pigeon 206 in the No-Cues condition would not necessarily have improved performance in the Reversed-Cues condition, because responding appeared to be undergoing extinction in the No-Cues condition. Alternatively, the disruption of the L-R sequence in the first Reversed-Cues condition may have interfered with any facilitative effects of the No-Cues condition for Pigeons 205 and 206.
Another possible contributor to the re-emergence of the L-R sequence in the final Reversed-Cues condition for Pigeon 205 is increased experience in the Guiding-Cues condition. The mastery criterion of greater than 80 % accuracy in the Guiding-Cues condition was met in the minimum five sessions for Pigeons 206, 207 and 208 after the first Guiding-Cues condition. Pigeons 206, 207 and 208 spent a total of 16, 16 and 15 sessions in the Guiding-Cues condition, respectively (after programming the buffer for Pigeon 208). Pigeon 205 spent 76 sessions in the Guiding-Cues condition because accuracy failed to meet the mastery criterion. This additional experience performing the sequence may have been responsible for increased accuracy in the last Reversed-Cues condition for Pigeon 205 compared to Pigeon 206. However, Pigeons 207 and 208 spent an equal or smaller number of sessions in the Guiding-Cues condition than Pigeon 206, suggesting that it was the experience in the No-Cues condition, and not additional experience in the Guiding-Cues condition, that facilitated higher levels of accuracy in the final Reversed-Cues condition.
Pigeons' performance in the final Reversed-Cues condition of Experiment 2 suggests that either the red or the white lights could function as discriminative stimuli to "guide" the L-R response sequence. Higher accuracy, and therefore more reinforcers delivered, in the presence of just the white lights in the No-Cues condition prior to the final Reversed-Cues condition was correlated with higher accuracy in the final Reversed-Cues condition. Pigeon 206 was the only pigeon that never produced the L-R sequence in the final Reversed-Cues condition after experiencing the No-Cues condition. Pigeon 206 also earned the fewest reinforcers in the No-Cues condition, and the two white keys may have signaled extinction, or at least a very low probability of reinforcement. Furthermore, for the three pigeons in which the L-R response sequence emerged at levels near or above mastery in the final Reversed-Cues condition, response-sequence accuracy increased across the five sessions of the No-Cues condition that preceded it. Together, these findings suggest that accurate predictions of how changes in discriminative stimuli will affect response-sequence accuracy can be made based on an inspection of the organism's behavioral history with the discriminative stimuli in question.

General Discussion
In Experiment 1, pigeon response-sequence accuracy increased and decreased in the same way as previously reported in rats (Reid et al. 2010) when discriminative stimuli signaling correct responses changed. In the No-Cues condition, the discriminative function of the key lights was removed by lighting both side keys white and the percent of correct sequences decreased for Pigeons 301 and 303. In the Reversed-Cues condition, the discriminative functions of the red and white lights were reversed; the correct L-R sequence required pecking the white keys. Correct L-R sequences never occurred in the Reversed-Cues condition. In Experiment 2, for three out of four pigeons, exposure to the No-Cues condition prior to the Reversed-Cues condition effectively enabled recovery of the L-R sequence in the Reversed-Cues condition.
Neither a feature-positive discrimination bias nor the S-R compatibility theory can explain the results of all guiding-cues experiments published to date. The S-R compatibility theory cannot account for the results of Reid et al.'s (2013) Experiment 3, and the feature-positive bias cannot account for the results of Experiment 1 or the replication in the first two conditions of Experiment 2 described above. Reid et al. (2013) concluded that behavioral history may play an important role in how changes in discriminative stimuli affect behavior. The results of Experiment 2 show that an account based on behavioral histories can in part explain why response-sequence accuracy may or may not be disrupted--specifically, the results of Experiment 2 suggest that a previous history with a stimulus as an S+ is sufficient for it to function as an effective "guiding cue" in the future, even if the organism has a history with that same stimulus as an S- in other contexts. So far, a behavioral-history explanation of guided-skill learning and of the disruption caused by changes in discriminative stimuli is more parsimonious and predictive than the other two explanations.
The fact that exposure to previous experimental conditions profoundly affected the disruptiveness of transitions between conditions is not a new finding. The phenomenon is known more broadly as "path dependence" in associative learning (e.g., Brown-Su et al. 1986; Miller et al. 1995). Both Experiment 2 and Reid et al.'s (2013) Experiment 4 illustrate the importance of path dependence in guided-skill learning experiments. In Experiment 2, path dependence is illustrated by the relative disruptiveness of the Guiding-Cues [right arrow] Reversed-Cues transition. When the transition was preceded by exposure to the No-Cues condition, it was relatively undisruptive for three of four pigeons: by the end of five sessions, response-sequence accuracy was at or near the levels observed in the Guiding-Cues conditions. When the transition was not preceded by the No-Cues condition, however, it resulted in profound disruption of response-sequence accuracy. In the context of operant conditioning and guided-skill learning, major questions remain. What factors contribute to the influence that experience has on current transitions? How much experience is necessary, and what specific conditions must be present in the learning history, for certain transitions to become less disruptive in the future? How do "environmental cues" (e.g., key lights) and "practice cues" interact or compete for control of a response sequence, and how do transitions affect control by these two potential sources? These are not easy questions to answer, but the "guiding-cues" framework has proven useful in starting to analyze some of the relevant variables.
Future research using this experimental preparation should include a stimulus change in the No-Cues conditions when a peck is recorded--for example, the lights could blink when a response is recorded. The lack of a visual change in these conditions may produce more L-L or R-R sequences because the "registering" of a response is not signaled, although responses in Experiments 1 and 2 did produce an audible "click" when the key micro-switch was closed.
The behavioral framework used here and by Reid and colleagues (2010, 2013, 2014) provides a useful tool for investigating the variables that affect motor-skill learning and for studying path dependence in operant conditioning. The behavioral-histories explanation offered here for why discriminative stimuli that are highly predictive of reinforcement don't control behavior is predictive and testable. Cognitive models and descriptions of motor-skill learning and techniques for improving motor-skill acquisition and maintenance (e.g., Schmidt 1975) are largely descriptive and do not provide predictions for how skills are maintained under changing conditions (Gluck et al. 2008; Reid et al. 2010). Future research might also test human performance within the current framework, thus extending its scope of application. This motor-skill learning paradigm might also be useful for testing effects of drugs on motor-skill acquisition, maintenance, and resistance to change.
Acknowledgement The authors thank Nathan Rice for assistance in computer programming.
References

Bacha-Mendez, G., Reid, A. K., & Mendoza-Soylovna, A. (2007). Resurgence of integrated behavioral units. Journal of the Experimental Analysis of Behavior, 87, 5-24.
Brown-Su, A. M., Matzel, L. D., Gordon, E. L., & Miller, R. R. (1986). Malleability of conditioned associations: path dependence. Journal of Experimental Psychology: Animal Behavior Processes, 12, 420-427.
Fitts, P. M., & Seeger, C. M. (1953). S-R compatibility: spatial characteristics of stimulus and response codes. Journal of Experimental Psychology, 46, 199-210.
Fox, A. E., & Kyonka, E. G. E. (2013). Pigeon responding on fixed-interval and response-initiated fixed-interval schedules. Journal of the Experimental Analysis of Behavior, 100, 187-197.
Gluck, M. A., Mercado, E., & Myers, C. E. (2008). Learning and memory: From brain to behavior. New York: Worth.
Hearst, E. (1991). Psychology and nothing. American Scientist, 79, 432-443.
Hommel, B. (1995). Stimulus-response compatibility and the Simon effect: toward an empirical clarification. Journal of Experimental Psychology: Human Perception and Performance, 21, 764-775.
Hommel, B. (2011). The Simon effect as tool and heuristic. Acta Psychologica, 136, 189-202.
Jenkins, H. M., & Sainsbury, R. S. (1969). The development of stimulus control through differential reinforcement. In N. J. Mackintosh & W. K. Honig (Eds.), Fundamental issues in associative learning (pp. 123-161). Halifax: Dalhousie University Press.
Jenkins, H. M., & Sainsbury, R. S. (1970). Discrimination learning with the distinctive feature on positive or negative trials. In D. Mostofsky (Ed.), Attention: Contemporary theory and analysis (pp. 239-273). New York: Appleton-Century-Crofts.
Kiernan, D., Ray, M., & Welsh, T. N. (2012). Inverting the joint Simon effect by intention. Psychonomic Bulletin and Review, 19, 914-920.
Kyonka, E. G. E., & Grace, R. C. (2007). Rapid acquisition of choice and timing in pigeons. Journal of Experimental Psychology: Animal Behavior Processes, 33, 392-408.
Kyonka, E. G. E., & Grace, R. C. (2008). Rapid acquisition of preference in concurrent chains when alternatives differ on multiple dimensions of reinforcement. Journal of the Experimental Analysis of Behavior, 89, 49-69.
Lattal, K. A. (1975). Reinforcement contingencies as discriminative stimuli. Journal of the Experimental Analysis of Behavior, 23, 241-246.
Lotz, A., Uengoer, M., Koenig, S., Pearce, J. M., & Lachnit, H. (2012). An exploration of the feature-positive effect in adult humans. Learning & Behavior, 40, 222-230.
Miller, R. R., Barnet, R. C., & Grahame, N. J. (1995). Assessment of the Rescorla-Wagner model. Psychological Bulletin, 117, 363-386.
Nallan, G. B., Miller, J. S., McCoy, D. F., Taylor, R. T., & Serwatka, J. (1984). Transfer effects in feature-positive and feature-negative learning by pigeons. American Journal of Psychology, 97, 509-518.
Proctor, R. W., & Reeve, T. G. (1990). Stimulus-response compatibility: An integrated perspective. Amsterdam: North-Holland.
Reed, P., Schachtman, T. R., & Hall, G. (1991). Effect of signaled reinforcement on the formation of behavioral units. Journal of Experimental Psychology: Animal Behavior Processes, 17, 147-162.
Reid, A. K., Chadwick, C. Z., Dunham, M., & Miller, A. (2001). The development of functional response units: the role of demarcating stimuli. Journal of the Experimental Analysis of Behavior, 76, 303-320.
Reid, A. K., Nil, C. A., & Getz, B. R. (2010). Changes in stimulus control during guided skill learning in rats. Behavioural Processes, 84, 511-515.
Reid, A. K., Rapport, H. F., & Le, T. (2013). Why don't guiding cues always guide in behavior chains? Learning & Behavior, 41, 402-413.
Reid, A. K., Folks, N., & Hardy, J. (2014). On the dynamics of stimulus control during guided skill learning in nonhumans. Behavioural Processes. doi:10.1016/j.beproc.2014.01.017.
Sainsbury, R. S. (1973). Discrimination learning utilizing positive or negative cues. Canadian Journal of Psychology, 27, 46-57.
Schmidt, R. A. (1975). A schema theory of discrete motor skill learning. Psychological Review, 82, 225-260.
Schneider, S. M., & Davison, M. (2005). Demarcated response sequences and the generalized matching law. Behavioural Processes, 70, 51-61.
Schneider, S. M., & Morris, E. K. (1992). Sequences of spaced responses: behavioral units and the role of contiguity. Journal of the Experimental Analysis of Behavior, 58, 537-555.
Shimp, C. P. (1981). The local organization of behavior: discrimination of and memory for simple behavior patterns. Journal of the Experimental Analysis of Behavior, 36, 303-315.
Shimp, C. P. (1982). On metaknowledge in the pigeon: an organism's knowledge about its own behavior. Animal Learning and Behavior, 10, 358-364.
Simon, J. R. (1969). Reactions toward the source of stimulation. Journal of Experimental Psychology, 81, 174-176.
Urcuioli, P. J., Vu, K.-P. L., & Proctor, R. W. (2005). A Simon effect in pigeons. Journal of Experimental Psychology: General, 134, 93-107.
A. E. Fox and E. G. E. Kyonka, Department of Psychology, West Virginia University, Morgantown, WV 26506, USA
A. K. Reid, Department of Psychology, Wofford College, Spartanburg, SC, USA
Present Address: A. E. Fox, Department of Psychology, St. Lawrence University, Canton, NY, USA
Original Article
Authors: Fox, Adam E.; Reid, Alliston K.; Kyonka, Elizabeth G. E.
Publication: The Psychological Record
Date: Sep 1, 2014