Back to School on Civil Rights

III. Grant Administration, Compliance Monitoring, Complaint Handling, and Enforcement Functions

A. Grant Administration
B. Oversight: Federal Monitoring of States
C. Oversight: Complaint Handling
D. Enforcement

The legal authority for the Department of Education (DoED) to ensure compliance with the Individuals with Disabilities Education Act (IDEA) is found in provisions of the statute itself that authorize assessment of policy and procedure documents to determine state eligibility for funding,[85] referral of a state to the Department of Justice, and withholding funds when a state has failed to comply substantially with any provision of Part B of IDEA.[86]

The key activities that the Office of Special Education Programs (OSEP) carries out in relation to monitoring state compliance with the law are (1) determining state eligibility for federal grants under IDEA, (2) conducting on-site monitoring visits and issuing monitoring reports, (3) developing corrective action plans and overseeing the implementation of those corrective actions ordered by OSEP, and (4) initiating enforcement action. This part discusses these core federal functions of IDEA implementation oversight.

A. Grant Administration
1. The Basic State Grant Program
IDEA '97 requires the states to submit applications that ensure "to the satisfaction of the Secretary" that they have policies and procedures that meet the conditions of federal law.[87] These conditions include (1) access to a free appropriate public education (FAPE), (2) individualized education programs (IEP), (3) least restrictive environment (LRE), (4) procedural safeguards, (5) evaluations, (6) general supervision by the state education agency (SEA), (7) a comprehensive system of personnel development, (8) personnel standards, (9) performance goals and indicators, and (10) participation in assessments.[88] 

Before the enactment of IDEA '97, a state plan was submitted to OSEP every three years to determine eligibility. States were required to submit assurances that they were complying with the various requirements during the three-year interim period. IDEA '97 no longer specifically requires a state plan, and one submission of policies and procedures information, if accepted, remains in effect indefinitely. Modification of a state eligibility document may be required if (1) the state determines that a modification is required, perhaps because of changes in state law or regulations; (2) there is a change in IDEA by amendment or a new interpretation of IDEA by a federal court or a state's highest court; or (3) there is an official finding of noncompliance with federal law or regulations. When the Federal Government requires a modification of the application, it need only be to the extent necessary to ensure the states' compliance with the part of the law that is newly amended, interpreted, or out of compliance, not the entire law or larger portions of the law.[89]

For FY 1997, OSEP did not require states to submit a detailed application, as the reauthorization of IDEA was imminent and significant changes in the law were anticipated. OSEP thought it would be prudent to wait until the new law was enacted. The reauthorization was not complete until June 1997, and the regulations to implement the new law were not finalized until March 12, 1999. Thus, since 1997, OSEP has allowed states to receive their funding by signing assurances that they would comply with existing federal law. In 1997, after the law was reauthorized, OSEP sent all states a packet explaining the requirements of IDEA '97. Beginning in 1998, OSEP gave states the option of submitting an application or signing a statement of assurances. One state, Wisconsin, submitted an application, which was approved. All of the other states have signed and submitted assurance statements to OSEP for fiscal years '97-'98, '98-'99 and '99-'00.[90]

OSEP generally notified the states of information that would be due about three months prior to the actual due date. Every state had to allow a 60-day public review period for the eligibility documents prior to submitting them to OSEP. States could publish notices of availability in newspapers, distribute them in libraries, etc. The due date to the Federal Government was generally April 1 or May 1. OSEP took two to three months to review the documents and generally awarded funds by July 1 of the same year.

States submitted an original and two copies of their documents to the Monitoring and State Improvement Planning Division (MSIP). MSIP staff logged them in, keeping one copy in a central file and giving copies to two readers, a primary and a secondary reader. The primary reader was generally the person assigned to that state as the "state contact" for monitoring, technical assistance, etc. This person was to be familiar with any monitoring issues in that state. Both readers read the documents with a checklist to determine if the required elements were present. The readers met with the team leader and discussed the documents. The team could choose to coordinate its review with other divisions in DoED and provide the state technical assistance if needed to amend the application. If there were significant problems with the application, the Office of General Counsel (OGC) could become involved. If the team agreed to recommend approval, the application was eventually approved by the director of OSEP, and an award was sent to the state. If the team did not recommend approval, the state was given reasonable notice and an opportunity for a hearing in accordance with the statute before the Secretary of Education made a final determination of ineligibility.[91]

In the past, OSEP gave either "full" or "conditional" approval of a state plan. Full approval implied that the state had satisfied the Department of Education that the necessary policies and procedures to carry out IDEA were in place. Conditional approval indicated that, while a policy or procedure was not in compliance with IDEA, the state had given assurances that its practice was in compliance. For example, a state might have needed to change a state law to come into compliance, but such a change might not have been possible for more than a year if the legislature met only every other year. OSEP would provide conditional approval to such a state after it assured DoED that it was following the federal law and working to change the state law. Both conditional and full approval provided for full funding to the state.

As Table 2 indicates, states frequently received conditional approval of their plans. However, in the last year during which plans were submitted to OSEP, '95-'96, fewer conditional plans and more fully approved plans were in evidence. For FY '93-'94, the status of plans was as follows: 31 plans were fully approved and 27 were conditionally approved. For FY '94-'95, 43 plans were fully approved and 15 were conditionally approved. For FY '95-'96, 46 plans were fully approved, 10 were conditionally approved, and 2 received a "not applicable" ranking.[92] The percentage of fully approved state plans rose from 53 percent in FY '93-'94 to 74 percent in FY '94-'95 to 82 percent in FY '95-'96.

Table 2: Status of Approval of IDEA Part B State Plans/State Plan Reviews
States 95-96 94-95 93-94
Alabama F C F
Alaska F F F
American Samoa F F F
Arizona F F F
Arkansas F F C
California C C C
Colorado F F F
Connecticut F F F
Delaware F F C
District of Columbia C C C
Florida F F F
Georgia F F C
Guam F F C
Hawaii F C C
Idaho F F F
Illinois F F C
Indiana F F F
Iowa F F F
Kansas F F C
Kentucky F F C
Louisiana F F C
Maine C C C
Maryland F F C
Massachusetts F F C
Michigan C C C
Minnesota F C C
Mississippi F F F
Missouri F F F
Montana F F F
Nebraska C C F
Nevada F C C
New Hampshire F F C
New Jersey F C C
New Mexico F F F
New York F F F
North Carolina F F F
North Dakota F F F
Northern Mariana Islands F F F
Ohio C C C
Oklahoma F F F
Oregon F F F
Pennsylvania F F C
Puerto Rico F F F
Rhode Island F F F
South Carolina F F C
South Dakota F F F
Tennessee C C F
Texas F C C
Utah F F F
Vermont F F C
Virgin Islands Consolidated
Virginia C F C
Washington F F F
West Virginia F F F
Wisconsin C F F
Wyoming F F F
Marshall Islands NA F F
Federated States of Micronesia NA C C
Republic of Palau C C C

C = Conditional Approval, F = Full Approval, NA = not applicable due to changing legal status.

The reasons for the increase in states being fully approved are not readily apparent. An inquiry and analysis beyond the scope of this study may provide an explanation for this shift.

2. Competitive State Program Improvement Grants
The 1997 IDEA amendments included a new discretionary program titled State Program Improvement Grants for Children with Disabilities.[93] The purpose of these grants is to assist states, in partnership with a range of stakeholders in the states, in reforming and improving their systems that serve students with disabilities. Congress appropriated $35.2 million for these grants in FY '99. The grants will be awarded to states on a competitive basis, in the range of $500,000 to $2 million per year. The first awards were made in January 1999. Seventy-five percent of the funding received under these grants must go for personnel preparation.[94]

The statute outlines the analyses the state must conduct in developing a state improvement plan. That analysis must include the major findings of the most recent federal reviews of state compliance as they relate to improving results for children with disabilities.[95] The law also requires that the state improvement plan include improvement strategies, one of which must address systemic problems identified in federal compliance reviews.[96]

Although it is not yet clear how competitive state grants will affect state compliance with IDEA, they are intended to create an incentive toward the systemic changes a state must implement to achieve full compliance with IDEA.

3. Findings and Recommendations

Finding # III A.1
Many states are found eligible for full funding under Part B of IDEA while simultaneously failing to ensure compliance with the law.
Though no state is fully ensuring compliance with IDEA, states usually receive full funding every fiscal year. Once eligible for funding, a state receives regular increases, which are automatic under the formula. OSEP's findings of state noncompliance with IDEA requirements usually have no effect on that state's eligibility for funding unless (1) the state's policies or procedures create systemic obstacles to implementing IDEA, or (2) persistent noncompliance leads OSEP to impose high risk status, with "special conditions" that must be met for continued funding.

Recommendation # III A.1
The Department of Education should link a state's continued eligibility for federal funding under Part B to the remedy of any noncompliance within the agreed-upon time frame.
When a state is found out of compliance with the law via federal monitoring, continued eligibility for IDEA funding should be linked with achieving compliance within a designated time frame. The state corrective action plan or compliance agreement should spell out what must be done within a specific time frame to achieve compliance, or the state will be found ineligible for all or part of the available grant money for the next fiscal period.

Finding # III A.2
The competitive State Program Improvement Grants are intended to make funding available to states for implementing improvement strategies to correct IDEA noncompliance problems.

Recommendation # III A.2A
OSEP should require that five percent of funds awarded under the State Program Improvement Grants be applied toward developing a statewide standardized data collection and reporting system for tracking the core data elements needed to measure state compliance with IDEA and evaluate educational results for children with disabilities.

Recommendation # III A.2B
When a state is found out of compliance with the law via federal monitoring, continued eligibility for State Program Improvement Grant funding should be linked with achieving compliance within a designated time frame.

B. Oversight: Federal Monitoring of States
1. Purpose of Monitoring
States are regularly monitored by OSEP. Such monitoring includes on-site visits, data collection and analysis, and the issuance of an official report. This basic monitoring process has undergone periodic changes since the enactment of IDEA. As noted in the review of annual reports below, the purpose of monitoring has shifted over the years depending on the context in which it was carried out. The law states that the Federal Government's role is one of monitoring the states to ensure their implementation of the law. Indeed, much of the responsibility for compliance lies with the states, which are in turn responsible for monitoring the local education agencies (LEAs). The Federal Government has increasingly looked to the states to take on this role and gradually redefined its role as one of partnership with the states. In fact, the IDEA amendments of 1997 strengthen the expectation that the states will monitor the LEAs. The statute holds that states are expected to reduce or withhold payments to LEAs if they are found to be out of compliance with the law.[97] For the first time, in 1998, the Federal Government took enforcement action against a state for not taking effective enforcement action against an LEA found to be out of compliance (see discussion of Pennsylvania as a high risk grantee).

OSEP claims its approach to monitoring has had significant positive impacts on compliance in a number of states. For example, the state educational agency (SEA) in some states has taken action to correct deficient practices identified by OSEP during the monitoring review, even before the state has received OSEP's report. In such instances, the states' solutions have often incorporated technical assistance provided by OSEP during the monitoring visits. According to OSEP, a number of states also have made positive changes, at least in part because of the emphases and findings of OSEP monitoring, in two important areas: (1) state monitoring and complaint resolution procedures, and (2) the movement of many children with disabilities from separate settings into less restrictive placement options.[98]

OSEP currently describes its monitoring as shifting from being procedurally oriented to being results oriented.[99] The purpose of monitoring as defined by OSEP today is to improve results for children with disabilities.[100] As mentioned earlier, OSEP has redesigned its monitoring process (see Appendix H) to be a component of what it calls a "state review and improvement process" where the state is a collaborator with the Federal Government and other constituencies to assess the educational success of students with disabilities and to design and implement steps for improvement.[101] There appears to be a shift away from monitoring used solely as a tool for obtaining compliance toward monitoring used as a tool for both program improvement and compliance.

2. The Decision About What to Monitor
OSEP is responsible for ensuring that states are in compliance with IDEA. The requirements of IDEA are numerous, and not every requirement is monitored in every state on every monitoring visit. Neither are the same requirements monitored for the same state over time. However, as the analysis below of the most recent monitoring reports (1994-1998) indicates, there does appear to be a relatively stable set of requirements that are monitored. The decision about exactly what to monitor in a state during a particular monitoring visit appears to be determined by the team doing the monitoring based on its analysis of the information it collects about the state.

A 1995 memo from Thomas Hehir, director of OSEP, to Chief State School Officers indicates that monitoring and corrective action plans will focus on requirements that have the most direct relationship to student results. These requirements are identified as (1) access to the full range of programs and services available to nondisabled children, including regular and vocational education programs and curricula and work-experience programs; (2) individualized education programs, including statements of needed transition services for students age 16 and, if necessary, for younger students; (3) education of students with disabilities in the regular education environment and the availability of a continuum of alternative placements; and (4) state systems for general supervision, including complaint management and due process hearing systems.[102]

3. The Monitoring Cycle
For 1997-1998, OSEP conducted implementation planning visits in lieu of monitoring visits. The purpose of these visits was to provide technical assistance to states on the requirements of the new law. OSEP began monitoring with the new continuous improvement monitoring process in the fall of 1998. Before IDEA '97, states were on a four-year monitoring cycle. Every year 12 to 15 states were monitored.[103] The monitoring cycle described and the monitoring reports analyzed below predate the changes OSEP implemented in the fall of 1998.

4. The Monitoring Process Before the Fall of 1998
The monitoring process took place in four phases: pre-site activities, the on-site visit, the issuance of the report, and the corrective action plan.

a. Pre-Site Activities
Approximately three to six months before an on-site visit, OSEP took the following steps: (1) scheduled public meetings and on-site visit dates with the state, (2) informed interested parties of the meeting dates and sites, (3) requested documents from the state for review, (4) held public and outreach meetings in the state to gain input, (5) determined issues to be reviewed and established a schedule for interviews with the SEA, (6) selected agencies and schools/programs to be visited, (7) contacted local sites, (8) established schedules, and (9) requested documents. Monitoring staff were usually in the state for about one week for the pre-site activities.

Beginning in 1994, OSEP conducted outreach meetings in addition to the public meetings, which were open forums. The outreach meetings were by invitation only and included disability leaders in the state and representatives of the Parent Training and Information (PTI) centers and the Protection and Advocacy (P&A) systems. Generally about 12-20 disability leaders from the state attended the meetings.

Attendance at the public meetings ranged from five to 200. Between one and six public meetings were held in different geographic locations in a state, at different times of the day. SEA mailing lists, and sometimes lists from PTIs or other advocacy groups, were used to send "interested party" invitations to the meetings.

After the pre-site activities, in preparation for the site visit, the monitoring staff analyzed the information collected in the state and gathered and considered additional relevant information obtained from (1) complaints received by OSEP about the state and its policy and procedures, and (2) contacts with the Office for Civil Rights (OCR), the Rehabilitation Services Administration (RSA), and advocacy groups within the state. All of this information was used to determine what issues were to be examined and where the on-site visits were to take place.

b. The On-Site Visit
The on-site visit usually lasted a week and took place about five to six weeks after the pre-site activities. Six to ten people made up the monitoring team. The on-site visit involved meeting with officials of the SEA and visiting LEAs, including schools. The monitoring team used the information gathered from the pre-site activities to determine which LEAs to visit. It considered when the state last monitored the LEA, and chose some LEAs that had been recently monitored by the state and some that had not been monitored for a long period of time. It looked at the results of the SEA monitoring and compared them to its own results. If the team saw deficiencies that the state had identified but that had not been corrected, it knew the state was not enforcing corrections. If it found deficiencies that the state monitoring had not found, that was an indication that the state monitoring system was not effective in identifying deficiencies.[104]

In smaller states, the monitoring teams usually visited four or five LEAs. In larger states, the teams visited eight to ten LEAs. The LEAs were notified by the SEA two to three weeks in advance that the monitoring team would be visiting. The team tried to have geographic diversity in its visits and took special populations into consideration. It looked at LEA data regarding placements in separate settings, personnel, related services, etc. The data could reveal problems in the LEA that the team would then pursue during its visit. The team tried to visit elementary schools, middle schools, and high schools. It met with administrators, looked at student records, and interviewed teachers. It did not observe students or compare the students' records to the students' experience.[105]

The team members in the field talked with the team members at the SEA to discuss data collection and potential findings. An exit conference was held with the SEA to present the preliminary findings.[106]

c. The Monitoring Report
The monitoring team returned to Washington, DC, and worked together to analyze the data they had collected and the results of the monitoring visit. The team might call the state back to request clarification or additional information. The report was developed and reviewed by the team leader, the division director, the director of OSEP, and the OGC. The report was cleared and issued to the Chief State School Officer with a copy sent to the director of special education in the state.

The intended time line for the issuance of the report was 150 to 180 days after the on-site visit.[107] Analysis of the most recent monitoring reports for each state revealed that the time elapsed between the monitoring visit and the final report was greater than 90 days for 45 states, greater than 180 days for 27 states, and greater than 365 days for 12 states.

In the past, OSEP issued draft reports to the states, and the states could then respond and defend their positions. OSEP would consider the response and might revise the report accordingly. OSEP eliminated this practice with the 1994-95 monitoring cycle. It began issuing only the final report. The state had 15 calendar days from the date it received the report to submit a letter to OSEP documenting findings in the report that were without legal or factual support. If OSEP determined that it was necessary to delete or revise a finding, a letter setting forth the deletion or revision was appended as part of the report.[108]

d. Corrective Action Plans
Every monitoring report that documented findings of noncompliance (in practice, every monitoring report) set forth parameters for a corrective action plan (CAP). OSEP was available to work with the state to develop the plan. The plan was to be submitted to OSEP within 45 days of receipt of the report. If the state did not submit a plan, OSEP would unilaterally develop the CAP for the state.[109] (OSEP reported that to its knowledge this circumstance never occurred.)[110]

The time line for completing a corrective action plan ranged from one to three years, with the average being two years. The deadline depended on the nature of the deficiency, as correction for some might take significantly more time than for others.

Follow-up visits might be conducted to assess implementation of the CAP; for some states, follow-up consisted of submitting documentation. OSEP reported that it generally conducted four to six follow-up visits per year to assess CAP implementation.

Generally, follow-up visits were similar to mini on-site visits. The follow-up team comprised two to three people who visited the state office for about two days and LEAs for about two days. If OSEP determined that the corrective action plan had been implemented and was effective, it closed out the plan. In situations where OSEP found little or no change, it scheduled another follow-up visit. In two situations (Pennsylvania and New Jersey) where the second follow-up visit found continued noncompliance, the states were designated as high risk grantees (see earlier discussion).

e. OSEP's Maintenance of Monitoring Reports and Records Regarding Monitoring Reports
OSEP's policy was to keep monitoring records related to IDEA for three to five years.[111] Thus, OSEP appeared to have very few monitoring reports more than five years old, nor did it have an inventory listing that reported which ones it possessed and which ones it did not. This study initially requested a complete set of reports for 11 states, going back in time as far as DoED had records. Because of the limited availability of reports, this request was modified to include only six states. For one state, Illinois, the oldest report DoED had was from 1991. For other states, some reports were missing (for example, while DoED had the 1983 report from New York, it did not have the 1987 report). OSEP maintained no chronology of its monitoring over time.

5. Analysis of Fifty Federal Monitoring Reports
Little research on state compliance with special education requirements over time has been conducted. NCD was aware of only one study that had examined compliance trends. That study, released in 1993 by the National Council on Disability, disaggregated OSEP state monitoring data collected from April 1989 to February 1992 to the school district level. The study revealed very high levels of school district noncompliance, as noted in Table 3 below.[112]

Table 3: State Monitoring Data (Reprint from NCD Study)
Requirement | Districts Monitored | Districts in Noncompliance | Percentage in Noncompliance
IEP | 165 | 150 | 90.9%
LRE | 165 | 143 | 86.7%
Procedural Safeguards | 165 | 152 | 92.1%

Note: IEP = Individualized Education Program; LRE = Least Restrictive Environment

The analysis below, based on a study of the most recent OSEP monitoring report issued for each state, summarizes the findings of noncompliance for each state in seven areas.

a. Methodology
The most recent OSEP monitoring report of every state was reviewed and analyzed. These reports were issued between 1994 and 1998. Seven key areas of legal requirements were analyzed for each state: (1) FAPE, (2) LRE, (3) IEP, (4) transition, (5) general supervision, (6) procedural safeguards, and (7) protection in evaluation. These were requirements that OSEP had chosen to monitor in most of the states and that had been monitored fairly consistently across states over time.

b. Standards Used by OSEP for Determining Noncompliance
It should be noted that the charts and tables throughout this section depict findings of noncompliance in the indicated areas for each state, but not the extent of noncompliance represented by that finding. 

The OSEP monitoring process had no measurable benchmarks or clear criteria for distinguishing the severity of LEA noncompliance with any given requirement. OSEP reported that it made a finding of noncompliance in a state only when such noncompliance was "systemic," meaning that it had occurred "with some frequency,"[113] although there was no regulation or documented policy, guidance, or internal procedure stating this particular criterion. Indeed, the "systemic" criterion, even as OSEP defined it, was not consistently applied in making determinations of noncompliance.[114] This lack of consistency in how findings of noncompliance were made seemed at variance with the compliance standard for SEAs as articulated in the law and in OSEP's own communication to the states (see following discussion).

IDEA requires the SEA to "ensure" that the law's requirements are met by all educational programs that are, or should be, delivering special education services to students with disabilities.[115] In the 1997 Texas Monitoring Report, OSEP clarified the scope of the SEA's full responsibility for ensuring compliance, regardless of the methods the SEA might have used to identify and "count" deficiencies for correction.

"The procedures for TEA's District Effective Compliance system (Reference Guide, September 1996) state that, 'a discrepancy will be cited during the on-site review when it is determined that the violation in question occurs systemically throughout a campus, a district, or a cooperative... As a general rule, a discrepancy will be cited when a violation is found in 30 percent or more of the student programs reviewed.... Violations of "a more serious nature"...are to be cited whenever a single violation occurs. Otherwise, violations that occur in less than 30 percent of the files sampled are not cited, and TEA requires agencies to take no corrective action.'

"Although a state educational agency has some discretion about the method it uses to identify and ensure correction of deficiencies, it is responsible for ensuring that all Part B requirements are met by subgrantees for all students with disabilities. TEA must identify and document all noncompliance found through its monitoring process, even where the violation does not reach the 30 percent threshold, or does not meet the definition for "violations of a serious nature." Further, although corrective action that TEA requires may vary depending upon how isolated or systemic a finding is, it must ensure correction of all identified noncompliance."[116]

In this monitoring report, OSEP communicated the expectation that Texas' corrective action on this issue was to monitor such that all deficiencies were identified and corrected, "regardless of the prevalence or magnitude of those findings."[117] OSEP's finding and explanation made clear that it was the responsibility of the SEA to ensure correction of any occurrence of noncompliance with IDEA. Insofar as the SEA failed to ensure that all Part B requirements were met, the SEA was not in compliance with IDEA.

Although OSEP articulated a clear standard with respect to findings of noncompliance, it emphasized that the severity and extent of noncompliance varied with each finding. A finding might have been based on an egregious problem or on a technical deficiency of a less serious nature (i.e., a finding of noncompliance with the procedural safeguard requirements might have been based on (1) a wholly ineffective due process hearing system, or (2) the state's failure to provide a fully accurate explanation of a procedural safeguard as part of its required notice to parents).[118] Likewise, a noncompliance finding might also have been based on several to many instances of noncompliance with a requirement. These variations in the severity and extent of a noncompliance finding, however, do not lessen the responsibility of the SEA for identifying and ensuring that all instances of noncompliance are corrected.

c. Summary of State Noncompliance Findings
Chart 4 below indicates how many states failed to ensure compliance in each of the listed areas according to the most recent monitoring report for each state. The largest areas of noncompliance were general supervision, where 90 percent, or 45 states, failed to ensure compliance, and transition, where 88 percent, or 44 states, failed to ensure compliance. Other key noncompliant areas were FAPE, where 80 percent, or 40 states, failed to ensure compliance, and LRE, where 72 percent, or 36 states, failed to ensure compliance. Table 5 provides a state-by-state display of areas out of compliance. Thirty states failed to ensure compliance in five, six, or all seven of the IDEA requirement areas considered by this report. Appendix G provides a one-page summary of the noncompliance findings for each state from its most recent monitoring report.

Chart 4: Number and Percentage of Noncompliant States in Each Area 
According to 1994-1998 OSEP Monitoring Reports
Area of Noncompliance | Number of States Out of Compliance | Percentage of States Out of Compliance
General Supervision | 45 | 90%
Transition | 44 | 88%
FAPE | 40 | 80%
Procedural Safeguards | 39 | 78%
LRE | 36 | 72%
IEPs | 22 | 44%
Protection in Evaluation | 19 | 38%

[Table 5: State Noncompliance as Reported by 1994-1998 Monitoring Reports [119] not available.]

In the analysis of the fifty state monitoring reports below, each of the monitored requirements is described briefly with a summary of the findings from all fifty reports, followed by examples from the reports to illustrate the basis for OSEP's noncompliance findings.

d. Analysis of Findings of Noncompliance
i. Free Appropriate Public Education
FAPE gives children with disabilities access to the supports and accommodations they need to obtain an education, requiring that special education and related services be made available to them in accordance with their IEPs. OSEP found that 40 states (80%) had failed to ensure compliance with the FAPE requirements. Specific FAPE requirements and the percentage of states in noncompliance are illustrated in Chart 6:

[Chart 6: State Noncompliance with FAPE Requirements not available.]

(a) Extended School Year
ESY services must be made available to individual students who require such services in order to receive FAPE. This requirement recognizes that some students with disabilities will not receive an appropriate education unless they have special education or related services during the summer months.

OSEP found that 28 states (56%) had failed to ensure compliance with the ESY requirements, as shown in the following examples:

In Alabama, . . . [i]nterviews with teachers and administrators in public agencies A, B, and D revealed that extended school year was not available for students in the facilities visited by OSEP. Teachers interviewed ... stated that they were unsure as to the criteria for extended school year, and therefore did not know how to determine the need for extended school year services. None of these 11 teachers had ever participated in an IEP meeting where students were considered for such services. Both building level and district administrators... confirmed that teachers and administrators were not aware of the criteria for extended school year services.[120]

In four out of five public agencies visited in Iowa, OSEP determined that ESY services were not considered on an individual basis and provided to students who required them.[121]

In Delaware, OSEP found that availability of ESY services was restricted to students with autism and those who received "Level 5" services. Participation of other students in ESY services was not determined based on the IEP, and in some of the agencies visited it was not available to other students at all.[122]

In four of the five agencies visited in Connecticut, "...children with particular types of disabilities were categorically excluded from consideration for ESY services."[123]

Two teachers in an agency in Arkansas reported that the agency did not offer ESY and that it was never discussed at any IEP meeting they attended.[124]

(b) Related Services
Students with disabilities must be provided with related services such as occupational therapy, speech therapy, physical therapy, and psychological counseling based on their individual needs as reflected in their IEPs. This requirement recognizes that without these related services, some students with disabilities cannot adequately access and learn their curricular materials.

OSEP found that 34 states (68%) had failed to ensure compliance with the related services requirements, as shown in the following examples:

In Florida, ...OSEP was informed in interviews with district and building-based administrators, teachers, and related services personnel in Agencies F, G, and H that psychological counseling, as a related service, is not available to students with disabilities, regardless of need. A building-based administrator in Agency E indicated that many students need psychological counseling but it is not available as a related service.

...OSEP was informed by two related service providers in Agency G that they were instructed not to list individual therapy on their caseload(s). They stated that they will provide the service informally, but it is not reflected on the student's IEP (there are no goals and objectives).

...A special education teacher in Agency H told OSEP that students may have to go to a center-based or day program if they need more intense counseling services.[125]

In one agency in Minnesota, OSEP found that psychological counseling was not considered for inclusion in any student's IEP.[126]

An administrator from an agency in Arizona confirmed "that related services (speech therapy, occupational therapy, and physical therapy) are not based on the individual student's needs but are based upon the availability of the service provider."[127]

Administrators and teachers from two agencies in Oklahoma stated that psychological counseling services are not provided based on an IEP, even if a child needs such services to benefit from special education.[128]

In one district in California, an administrator told OSEP that there were 42 students whose IEPs called for speech services, but who were not receiving the services; in another district, an administrator reported that students whose IEP teams believed they needed mental health services to benefit from special education were referred to outside agencies for the services, rather than receiving the services free of charge through their IEPs.[129]

(c) Length of School Day
Unless their individual needs dictate otherwise, the length of the school day for students with disabilities must meet their state's general standard.

OSEP found that five states (10%) had failed to ensure compliance with this requirement, as shown in the following examples:

Administrators in two districts in Delaware reported that 17 students had their school days shortened by an hour and a half due to "transportation schedules."[130]

In Arkansas, ...[b]ecause there were not enough modified buses in the agency to transport students with disabilities, an administrator in Agency C reported that six students received one hour fewer per day than the state standard.

One administrator reported and another administrator confirmed that a classroom of children with disabilities in Agency B had their school day shortened by 30 minutes per day, which was less than the state standard, because students in a self-contained program were transported from the school where their classroom was located to their 'home school' in order to catch the regular bus.

An Agency J administrator reported to OSEP that four children with disabilities who attended the vocational technical program were in school one hour fewer than the state standard because of the time needed to transport them from another district. As a result, these children were only able to get two hours of credit for their vocational class at Agency J--instead of the normal three hours of credit.[131]

(d) Provision of Special Education/Program Options Available
Students' IEPs must set forth with specificity the amount of special education and related services the students are to receive. These decisions must be based upon individual need. In addition, program options that meet their needs must be made available to students with disabilities.

OSEP found that 15 states (30%) had failed to ensure compliance with these requirements, including the following examples:

In [Pennsylvania] public agency C, six of seven records reviewed by OSEP had no specific statements of special education or related services.[132]

In Connecticut, ...OSEP found that the technical vocational education such as that provided through the state-operated regional schools was not an available program option for students with moderate or significant disabilities. OSEP confirmed through interviews that although some high school students could benefit from technical vocational education available only at the regional programs, this option was not available to certain students with disabilities.[133]

In Kentucky, OSEP found that 22 of 53 IEPs reviewed, in three of the four agencies visited, either did not state the specific amounts of special education and related services or stated the amounts in ranges. Individuals interviewed reported that the amount of services was not based upon individual student needs. In addition, 12 of the 53 students were not receiving services that conformed to their IEPs.[134]

In Ohio, OSEP reviewed 94 student records in 11 of the 12 agencies visited, and identified 75 cases in which the amount of special education and related services was either not recorded on the IEP or the services were stated in ranges. Teachers, related service providers, and agency administrators reported that the amount of services was stated as a range because the lesser amount reflected state minimum standards, while the greater amount indicated the child's actual need. The child would receive the amount of services needed if the therapist had time to provide it; if not, the child received the lesser amount.[135]

ii. Least Restrictive Environment
LRE requirements hold that students with disabilities should be educated, to the maximum extent appropriate, with their nondisabled peers. Separate schooling or separate classes or other removal of children with disabilities from the regular educational environment must take place only when the nature or severity of the disability is such that education in regular classes with the use of supplementary aids and services cannot be satisfactorily achieved.

OSEP found that 36 states (72%) had failed to ensure compliance with the LRE requirements. Of the remaining fourteen states, OSEP found six not out of compliance on LRE but provided no information at all on LRE compliance for the other eight. In all six states found not out of compliance, the finding was based on site visits that had not included any separate facilities. Such facilities have been sources of findings of LRE noncompliance in many states.

It was also noteworthy that during this period OSEP conducted monitoring visits at only three state schools for students who are deaf or have visual impairments,[136] and at only three separate private facilities. These sorts of facilities have powerful political constituencies, both nationally and in many states. It is particularly important that OSEP monitor such facilities because states sometimes have failed to exercise their general supervisory authority over them. In Kentucky, for example,

"[a]t the time of OSEP's 1992 Monitoring Report, KDE [Kentucky Department of Education] acknowledged that it had not monitored the Kentucky School for the Deaf and the School for the Blind for approximately 10 years. Comments received at the public meetings held in June prior to OSEP's September 1995 on-site visit indicated that KDE maintains a "hands off" policy toward both state schools and that KDE has not yet monitored either school even though OSEP's 1992 report had cited KDE for failure to exercise general supervisory authority over these programs. During OSEP's 1995 monitoring visit, KDE administrators acknowledged that they had failed to exercise their general supervisory responsibility for these programs in that the Kentucky School for the Deaf had not yet been monitored by KDE for compliance.... 

Although the Kentucky School for the Deaf was conducting a self-study during the 1995-96 school year in preparation for an on-site monitoring visit during the 1996-97 school year, and the Kentucky School for the Blind had received an on-site monitoring visit in March 1995 and a follow-up visit in September 1995, at the time of OSEP's visit, KDE could not provide OSEP with documentation to verify that special education programs for children enrolled in these schools meet state and federal requirements."[137]

Finally, there was no evidence in the text of any of the reports indicating that OSEP reviewed the files of students placed in out-of-state residential facilities for LRE compliance. Without such review, it was difficult to determine OSEP's basis for the following conclusion: "During the 1992-1993 school year, Iowa Department of Education (IDE) placed approximately 200 students in out-of-state programs, based upon their unique needs."[138] Specific LRE requirements and the percentage of states in noncompliance are illustrated in the following chart:

[Chart 7: State Noncompliance with LRE Requirements not available.]

(a) Education with Nondisabled Students/Removal Only When Aids and Services Standard Met
Students with disabilities must be educated with nondisabled students to the maximum extent appropriate to meet their needs. Removal from less restrictive settings can occur only if students' IEPs cannot be implemented in those settings, even with the use of supplementary aids and services.

However, OSEP found that 32 states (64%) had failed to ensure compliance with these requirements, including the following examples:

OSEP found that in two districts in Mississippi, regular class placements were not discussed at annual review or IEP meetings for some students with disabilities. One teacher told OSEP that this did not occur "even though some of the students this teacher serves could probably perform satisfactorily in some of the regular academic classes."[139]

Administrators and teachers in three districts in Delaware told OSEP that these LRE requirements were not followed in their districts because the state's funding formula was a disincentive to regular class placements for students with disabilities.[140]

In Idaho, "....OSEP found that the removal of children with disabilities from regular education programs in public agency B was not based on a determination that the nature or severity of the disability is such that education in regular classes with the use of supplementary aids and services could not be achieved satisfactorily, but, rather on administrative convenience. A special education teacher of a self-contained program for students with moderate to severe/ profound disabilities ... stated, 'These students have been here forever. This is where they have been and this is where they are going to be.' She further stated that other options in less restrictive settings are not explored or considered by the IEP team."[141]

In Iowa, [t]wo...administrators responsible for the administration and supervision of programs in public agency E stated that the consideration of the supplementary aids and services needed by a student with disabilities is "not part of the IEP process."[142]

(b) Nonacademic and Extracurricular
Students with disabilities must participate with nondisabled peers in nonacademic and extracurricular activities and services to the maximum extent appropriate to their needs.

OSEP found that 29 states (58%) had not ensured compliance with these requirements, as shown in the following examples:

In New York, "[t]he special education director and a program administrator in public agency F informed OSEP that there was no individualized determination of the maximum extent to which each student with a disability placed in the BOCES' center-based (separate school) programs could participate with nondisabled children in nonacademic and extracurricular services and activities, and that there were currently no opportunities for such integration, regardless of individual student need."[143]

In South Carolina, "OSEP determined in interviews with administrators in agencies C and G that the participation of students with disabilities with nondisabled peers in nonacademic and extracurricular activities was not determined on an individual basis. The administrator in agency G reported efforts on the part of the agency to involve disabled students in nonacademic and extracurricular group activities at neighboring regular education schools. However, participation was not based on the individual needs of students, but on the activities (e.g., assemblies) being available to the entire class of special education students as a group activity. The administrator in agency C stated that participation in nonacademic and extracurricular activities is not occurring for most of the students enrolled in the agency C separate facility, even though these students could benefit from participation in nonacademic and extracurricular activities with nondisabled peers."[144]

In California, three administrators reported that "students identified as seriously emotionally disturbed who are served in a separate school program in the district, and students with disabilities who are served in the agency's preschool program (separate school), are not provided adequate opportunities for integration with age appropriate peers, regardless of individual need. [These administrators] reported to OSEP that as a general practice there was no individualized determination of the maximum extent to which each student with a disability placed in the separate school programs could participate with nondisabled children in nonacademic and extracurricular services and activities."[145]

(c) Placement Based on IEP
Placement decisions for students with disabilities must be based on their IEPs. The practice of not basing placement decisions on students' IEPs can have the effect of depriving some students with disabilities of access to schools attended by their friends and neighbors.

OSEP found that 19 states (38%) had failed to ensure compliance with this requirement, including the following examples:

An agency administrator in Ohio stated that "approximately 25 percent of the students who are placed into special education programs are placed prior to the development of their IEPs. A teacher [in the same agency] high school visited by OSEP stated that placements were based on parent request, administrative convenience, or category of disability, rather than on the students' IEPs."[146]

In Iowa, "[b]oth teachers interviewed by OSEP in the school visited in agency B indicated that placement is determined prior to the development of a student's IEP.

Two of the four teachers interviewed by OSEP in agency C indicated placement is determined prior to the development of a student's IEP.

An administrator and two teachers from the elementary school in agency D told OSEP that, for both initial and subsequent placements, placement is determined prior to the development of the student's IEP."[147]

In Connecticut, "OSEP found that students with moderate, significant, or profound disabilities are not permitted to attend the high school that agency D nondisabled students attend. Special education teachers, the administrator of the middle school, the administrator responsible for supervising the provision of special education services in agency D and a school nurse, and the PPT minutes in student records confirmed that placement practices for these students were not based on the student's IEP, but rather on the student's IQ, program location and availability of related services (e.g., medical services)."[148]

(d) Continuum Available to Extent Necessary
A continuum of placement options must be available to students with disabilities to the extent necessary to implement their IEPs. The lack of availability of a full continuum of placement options can have the effect of forcing students into placements that are more restrictive than necessary to implement their IEPs.

OSEP found that 17 states (34%) had failed to ensure compliance with this requirement, including the following examples:

Teachers and a building-level administrator in a Rhode Island public agency told "OSEP that, at their school, full-time regular education placement . . . was not a continuum option for any students with disabilities. At [a second public agency], three teachers told OSEP that full-time regular education was not a continuum option for any of the students with disabilities attending the school that OSEP visited. Administrators and teachers at [a third agency] told OSEP that currently, full-time regular education placement was not an option in the district."[149]

The inability or unwillingness of school districts to provide a full continuum of placement options also can have the effect of forcing students into placements that are more restrictive than necessary to implement their IEPs:

In New Jersey, "[a]n administrator stated that the Child Study Team ... looks at a student's classification at the annual review and determines whether or not a student is eligible for Resource Room services. A teacher and administrator further elaborated that the Resource Room option is limited to two periods a day. If more time is required, the student is placed in a self-contained classroom for a full day. There are no other options for resource service for more than two periods or less than a full day."[150]
(e) Placement Determined at Least Annually
Placement decisions for students with disabilities must be made at least annually. Failure to re-evaluate placement annually can result in continuing placements that no longer meet the educational and related service needs of the child.

OSEP found that eight states (16%) had failed to ensure compliance with this requirement, including the following examples:

"An administrator and two teachers from public agency C in North Carolina informed OSEP that placement determinations are reviewed after the triennial re-evaluation unless the child's parents want a program change prior to the re-evaluation. An administrator and one teacher from public agency D stated that placements for students with disabilities are determined at the time of initial placement into the special education program and thereafter at three-year intervals coinciding with the time of the student's re-evaluation, unless special circumstances arise indicating that a change may be needed. Teachers from public agencies F and H told OSEP that the IEP team does not reconsider the student's placement until the student is ready for a higher functioning program, or the student 'ages out' to the next level."[151]

In Georgia, "[w]hen asked how often placement determinations for students with disabilities are made, three administrators and four teachers from agencies A, D, and E informed OSEP that placement options are considered at initial placement and at triennial meetings, but not at annual reviews. 'At annual reviews, we just look at goals and objectives' explained a teacher from agency A."[152]

iii. Individualized Education Programs
IDEA requires that all students with disabilities have an individualized education program that documents (1) the student's current level of performance, (2) goals and objectives, (3) the services to be provided to meet the student's needs, (4) the dates for initiation of services and their anticipated duration, (5) criteria for determining the extent to which objectives are being met, and (6) transition services for students aged 16 and older.

OSEP found that 22 states (44%) had failed to ensure compliance with the IEP requirements. Specific IEP requirements and the percentage of states in noncompliance are illustrated in the following chart (Chart 8):

[Chart 8: State Noncompliance with IEP Requirements not available.]

(a) IEP Content
IEPs for students with disabilities must address their unique individual needs and must include students' present levels of performance; annual goals; short-term objectives; and evaluation criteria, procedures, and schedules. IEPs must also include the extent to which students will participate in general education programs.

OSEP found that 20 states (40%) had failed to ensure compliance with the IEP content requirements. One common sign that IEPs are not based on students' unique individual needs is the use of identical goals and objectives for many students. For example,

"OSEP's comparison of 17 IEPs in a New Jersey agency showed identical goals and objectives for 16 children. A teacher stated that all students were taught the same skills and that the goals were based on the curriculum. During the review of one IEP, OSEP discovered that a goals and objectives page had the name of another student on it. School personnel were unable to explain this discrepancy.

OSEP reviewed another student record that showed the same goals and objectives for three years. In another agency, a comparison of 12 IEPs showed identical goals and/or objectives for six children enrolled in a job orientation program. A teacher for three of the students stated that even though the IEP goals and objectives were identical in the children's IEPs, the children's needs were not identical. Another teacher for the other three children in that same agency told OSEP staff that the IEP short-term objectives were identical and did not address individual students' needs in terms of their participation in the job-orientation program."[153]

The failure to base IEPs on the unique individual needs of students is also shown by goals and objectives that do not correspond to the needs identified by students' IEPs:
In Kentucky, "[f]ourteen of the 53 IEPs reviewed by OSEP did not include goals and objectives to address each of the students' needs identified on the IEP. OSEP found that IEPs did not contain goals and objectives related to students' needs for instruction in special education settings or for related services such as speech therapy."[154]
States' violations of IEP content requirements are often widespread. The following table displays, for five states, the number of deficient IEPs as the numerator and the total number of IEPs reviewed as the denominator:

[Table 9: State Noncompliance with IEP Content Requirements in Five States not available.]

(b) IEP Meetings
IEP meetings must include a representative of the public agency, other than the student's teacher, who is qualified to supervise or provide special education, as well as the student's teacher. The meetings should also include the student, if appropriate, and may include other individuals at the discretion of the parent or agency. Agencies must take steps to ensure that the student's parent(s) participate in meetings, including giving timely notice of meetings, scheduling meetings at mutually convenient times and places, and using other methods to ensure parent participation when parents cannot attend.

OSEP found that 13 states (26%) had failed to ensure compliance with the IEP meeting requirements, including the following example:

In Massachusetts, "...OSEP was informed by four agency administrators, eight building administrators, and nine teachers in six public agencies...that one person, usually the educational programmer or the student's special education teacher, develops the goals and objectives after the IEP meeting. ...OSEP finds that this practice is inconsistent with...the requirement that one or both of the child's parents...must participate in the development of the child's IEP...."[155]
iv. Transition Services
Students age 16 and older (and younger if deemed appropriate) must have IEPs that include a statement of needed transition services.

OSEP found that 44 states (88%) had failed to ensure compliance with the transition requirements. Specific transition requirements and the percentage of states in noncompliance are illustrated in the following chart:

[Chart 10: State Noncompliance with Transition Requirements not available.]

(a) Notice
If a purpose of an IEP meeting is the consideration of transition services, the notice of the meeting must indicate this purpose, indicate that the student will be invited, and identify any other agencies that will be invited.

OSEP found that 35 states (70%) had failed to ensure compliance with the transition notice requirements. For example,

In North Carolina, "OSEP found that in most instances [the total in all agencies was 23 of 27 IEP notices] the notices used by four public agencies to inform parents of IEP meetings did not specify that a purpose of the meeting is the consideration of transition services, when those notices were for meetings for students who were 16 years or older."[156]
(b) Meeting Participants
If a purpose of an IEP meeting is the consideration of transition services, invitees must include the student and representatives of other agencies likely to be responsible for providing or paying for transition services. If the student does not attend, the public agency must take steps to ensure that the student's preferences and interests are considered.
"I've never been asked, 'Hey, what's your perspective? What can I do to make your education better?' And I feel like you can ask the parents all you want, but if you really want to get down to the heart of the problem and how the students are being affected, maybe you should ask them first." - A high school senior with a disability from South Carolina on having input to the IEP[157]
OSEP found that 38 states (76%) had failed to ensure compliance with these requirements, including the following examples:
In two New Hampshire public agencies, in 14 of 17 records reviewed by OSEP for students 16 years or older, the student was not invited to the IEP meeting.[158]
In Massachusetts, "OSEP reviewed the files of 18 students ages 16 and older in public agencies A, E, and F, and found that three of six students in agency A, four of six in agency E, and three of six in agency F did not attend their most recent IEP meeting. Four teachers and an administrator responsible for the administration and supervision of special education programs in those agencies told OSEP that they do not invite the student to the IEP meeting even if one of the purposes of the meeting is the consideration of transition services.
Three administrators responsible for the administration and supervision of special education programs, four building level administrators, and three teachers in public agencies A, E, and F told OSEP that there is no procedure for ensuring that the preferences and interests of the students are considered during the development of the statement of needed transition services."[159]
(c) Statement of Needed Services
The IEPs of students 16 and older, and of those who are younger if appropriate, must contain a statement of needed transition services, including (1) activities in instruction, (2) community experiences, (3) employment, and (4) adult living.

OSEP found that 34 states (68%) had failed to ensure compliance with these requirements. For example,

In Missouri, "OSEP found that out of a total of 42 IEPs of students 16 or older, 15 IEPs...contained no statements of needed transition services... An agency administrator explained to OSEP that the district has not done a good job on transition and that it is not district practice to provide transition services to post-secondary education for students with mild disabilities, such as learning disabilities."[160]

In Colorado, "[b]ased on a review of records for age-appropriate students in two agencies, OSEP found that 11 of 21 IEPs... did not contain statements of needed transition services or included incomplete statements of needed transition services. Incomplete statements... omitted services in one or more of the areas of instruction, community experiences, and employment/other post-school adult living objectives, and did not include a statement that the IEP team had determined that the student did not need services in those areas and the basis for that determination...."[161]

In New Hampshire, "public agencies A and E, in 16 of 17 records reviewed by OSEP for students 16 years or older, student IEPs did not include a statement of needed transition services or any information related to the provision of transition services...."[162]

v. General Supervision
The IDEA Part B general supervision requirement means that states must ensure the development and use of mechanisms and activities in a coordinated system to (1) ensure that the state's mechanisms for monitoring compliance with FAPE, LRE, and other IDEA requirements are coordinated and result in the correction of identified deficiencies; (2) ensure that educational and support services are provided to eligible students in juvenile and adult detention and correctional facilities, state-operated programs (e.g., schools for students who are developmentally disabled, blind, or deaf), and out-of-district placements; and (3) ensure appropriate and timely service delivery based on interagency coordination and assignment of fiscal responsibility. General supervision also requires that decision-making regarding these mechanisms and activities be based on the collection, analysis, and use of data from all available sources (e.g., complaint investigations and resolutions, due process determinations, mediation agreements, and court decisions). Some of the monitoring reports during the period under study treat all of these issues as part of general supervision, while others do not.

OSEP found 45 states (90%) failed to ensure compliance with general supervision requirements. Specific general supervision requirements and the percentage of states out of compliance are illustrated in the following chart:

[Chart 11: State Noncompliance with General Supervision Requirements not available.]

(a) Incarcerated Students
States must ensure that all individuals with disabilities ages three through 21 are identified, located, evaluated, and provided FAPE.

DoED found 18 states (36%) failed to ensure compliance with these requirements, including the following example:

"California Department of Corrections administrators responsible for educational programs in correctional facilities cited a recent study by that Department estimating that there are 6500-8500 youth with disabilities between the ages of 16 and 22 in the Department's facilities who would be eligible for special education and related services under current California law. They stated that the Department of Corrections currently offers adult basic education and literacy programs to assist inmates in attaining a high school diploma or high school graduation equivalency diploma, and provides adult literacy offerings, but that special education services are not currently available in any of the 29 facilities that house youth between 16 and 22."[163]
(b) Complaint Resolution

OSEP found 24 states (48%) failed to ensure compliance with the complaint resolution requirements. These requirements and the percentage of states out of compliance are illustrated in the following chart:

[Chart 12: State Noncompliance with Complaint Resolution Requirements not available.]

(i) Resolved within sixty days
Unless exceptional circumstances exist with respect to a particular complaint, states must resolve complaints within 60 calendar days.

DoED found 18 states (36%) failed to ensure compliance with the complaint time line requirement. Moreover, states sometimes exceed the mandated time line for large numbers of complaints. For example,

In Minnesota, "...MDE [Minnesota Department of Education] did not resolve 58 of the 100 complaints, received during the 1993-94 school year, within 60 days...."[164]

"Based on a review of the Pennsylvania Department of Education's [PDE's] complaint log for the period beginning January 1, 1991, and ending December 31, 1992, OSEP finds that 512 complaints were filed with PDE, and that in 168 cases PDE did not investigate and resolve the complaints within 60 calendar days after they were filed. OSEP reviewed a sample of 16 complaint files where PDE exceeded the 60-day time limit and found that 14 of those files did not contain documentation of an extension due to exceptional circumstances with respect to a particular complaint." [165]

(ii) Resolve any complaint
States must resolve every allegation in each complaint.

DoED found nine states (18%) failed to ensure compliance with this requirement. Some states have refused to investigate certain types of complaints. Some examples include the following:

In Kansas, "KSBE has no written policy or guidelines outlining its procedures for conducting complaint investigations. KSBE officials informed OSEP that KSBE does not issue a report outlining its findings when the complaint involves 'IEP team decisions.' IEP team decisions are defined by KSBE to include appropriateness of identification or placement decisions, or appropriateness of decisions involving types and amount of services. KSBE limits its complaint resolution to procedural issues alleging state or federal violations, such as whether the district is providing the type and amount of services listed on an IEP or whether the service providers meet specific state or federal criteria. When KSBE determines that a complaint is substantive rather than procedural, the parents are contacted, usually via phone, and advised that their appropriate avenue of relief is through a due process hearing. KSBE officials stated that records of requests for complaint investigation that are denied are not kept by KSBE. In the file of one complaint, OSEP found the following notation: 'This is not an issue which can be adjudicated through the formal complaint process, as the State Department of Education will not substitute its judgment for that of the IEP team. Therefore, no corrective action is required pursuant to this issue.'"[166]

In North Dakota, "OSEP found that in one complaint the issues raised by the parent regarding the provision of special education services for his daughter were investigated as if there were the possibility of a systemic problem within the unit and district policies and procedures that may have affected all children receiving special education services. Further, the written report addressed findings related to general policies affecting all children with disabilities rather than the individual circumstances of the complainant. Therefore, there was no investigation and resolution of the specific allegations of the complaint."[167]

The effect of these illegitimate complaint limitations has been to force parents either to drop the issue or to hire attorneys to represent their children in due process hearings.

(c) State Monitoring
OSEP found 35 states (70%) failed to ensure compliance with the state monitoring requirements. These requirements and the percentage of states in noncompliance are illustrated in the following chart:

Chart 13: State Noncompliance with State Monitoring Requirements

Method of Determining Compliance
     Lacked methods to determine compliance with some requirements: 44% (22 states)
     Lacked complete methods: 38% (19 states)
Effective Method for Identifying Deficiencies
     Lacked effective methods for identifying deficiencies: 42% (21 states)
Correction of Deficiencies
     Failure to ensure correction of deficiencies: 56% (28 states)

(i) Method/completeness of method to determine compliance
States must adopt complete methods for determining whether the public agencies responsible for carrying out special education programs comply with Part B requirements.

OSEP found 22 states (44%) lacked methods to determine compliance with some requirements, and 19 states (38%) lacked complete methods, including the following examples:

No method to determine compliance: "...OSEP reviewed AZDE's [Arizona Department of Education's] monitoring procedures document, Monitoring for Effectiveness of Compliance -- Master Guide, the Collaborative Program Review manual, and all other monitoring procedures and materials, and finds that the procedures that were in effect at the time of OSEP's visit did not include a method to determine compliance regarding the following requirements: §300.571--Consent for release of confidential information, §300.540-- Additional team members--SLD." [168]

Incomplete methods to determine compliance: "...§300.300--FAPE--Extended School Year services (ESY) - AZDE's monitoring procedures contain an element at 5.C.5.v that requires that "the IEP shall include consideration for extended school year services," and monitors are directed to review the IEP to determine if ESY services have been considered. There are no guidelines for determining the need for ESY and, in some cases, documentation on the IEP is limited to checking "yes" or "no" in response to the provision of ESY services. As a result, AZDE's method does not enable monitors to determine if the decision about the need for ESY is made on an individual basis at the IEP meeting, rather than on the category of disability or the program in which the student is enrolled."[169]

(ii) Effective method for identifying deficiencies
States must use effective methods to identify deficiencies in the public agencies responsible for carrying out special education programs.

OSEP found that 21 states (42%) lacked effective methods for identifying deficiencies. The methodology OSEP has used to make findings of noncompliance in this area has been to monitor public agencies recently monitored by the SEA. Findings are made if OSEP finds noncompliance with requirements that the SEA missed in its monitoring effort. For example,

"Although the Virginia DOE's [Department of Education's] monitoring instruments include elements that address all of the Part B requirements regarding placement in the least restrictive environment, OSEP found that VADOE's monitoring procedures had not been fully effective in determining compliance with all of those requirements. OSEP identified deficiencies in three agencies regarding placement in the least restrictive environment that VADOE did not identify when it conducted its most recent review of those agencies."[170]
Occasionally findings of noncompliance with the requirement to have effective methods for identifying deficiencies are based upon a failure to monitor districts regularly:
In Texas, "[d]uring the 1992-93 through the 1995-96 school years, Texas monitored 108 of its 1,065 districts. Only districts that volunteered to participate in the pilot were reviewed using the Results Based Monitoring system. With the exception of a few follow-up reviews resulting from previous comprehensive monitoring reviews, TEA's comprehensive cyclical monitoring was discontinued after the 1991-92 school year. As a result, 541, roughly half of Texas's districts, received only one visit between the 1986-87 and 1995-96 school years. Two-hundred five of these districts had not been monitored in eight or more years."[171]
(iii) Correction of deficiencies
States must adopt and use proper methods for the correction of deficiencies in program operations that are identified through monitoring.

OSEP found that 28 states (56%) had failed to ensure the correction of deficiencies identified through their monitoring processes. OSEP's methodology on this issue has been to visit agencies that the SEA had recently monitored, in which the SEA had made findings of noncompliance, and for which the SEA had verified that corrective actions were performed. Findings were made by OSEP if it discovered continuing noncompliance with the requirement at issue in the agency visited. On occasion, OSEP discovered that one reason for the continuing noncompliance was that the SEA had approved corrective actions that were inadequate to remedy the noncompliance. For example,

"...OSEP found in May 1995 that agencies A, C, D, and F were failing to complete a number of pre-placement evaluations within the state's 60 school day standard, although ISBE [Illinois State Board of Education] had found this deficiency in agency A in 1993, agency C in 1990, agency D in 1988, and agency F in 1989, and required each agency to correct the identified deficiencies ...."[172]

"OSEP noted in monitoring documents maintained by the Indiana Department of Education (IDE) that it had not ensured that subsequent to districts being monitored, the necessary actions to correct identified deficiencies were implemented by public agencies, nor had IDE ensured that noncompliant practices were discontinued. ... OSEP found similar deficiencies in public agencies that IDE had monitored, identified deficiencies, and subsequently verified that corrective actions had occurred. In addition, some deficiencies in agencies monitored by OSEP during its 1992 monitoring visit reappear in this Report. IDE had previously provided written assurances and documentation that deficiencies identified by OSEP in these agencies had been corrected."[173]

"Both OSEP and LDE [Louisiana Department of Education] identified some of the same noncompliance activities regarding LRE in agencies B, C, D, and E .... In two instances the corrective action plan directed the LEA to provide in-service training to staff and to allow for more opportunities for students to interact with nondisabled peers. These activities were completed, but some students continue to lack any opportunities to participate with nondisabled students for academic, nonacademic, or extracurricular activities. In one instance the facility was to develop an interagency agreement. This was accomplished, but the placement process continues to disallow individual determinations of the maximum extent to which students can be educated with nondisabled students." [174]

In California, OSEP noted that "... many deficiencies identified in agency F in CDE's [California Department of Education's] 1993 review and OSEP's 1991 review were uncorrected. CDE required agency F to submit corrective action materials in the form of completed compliance resolutions or compliance agreements after its 1993 review. ... CDE approved all compliance resolution materials .... The corrective actions submitted by agency F and approved by CDE, required agency F to change its policies and procedures to make them consistent with state and federal requirements, but did not require training or other procedures to ensure that practice was changed or documentation to ensure that deficiencies had been corrected on an individual and/or systemic basis. ... CDE also conducted a follow-up visit required by the OSEP corrective action plan. CDE focused its follow-up on deficiencies identified by OSEP in its 1992 Report and found that agency F had corrected these findings. CDE's follow-up review, however, only confirmed that public agencies had established policies and procedures that were consistent with the requirements ...; CDE did not investigate whether public agencies implemented these requirements, and OSEP found as part of its 1995 review that agency [F] continued to implement practices that were not consistent with these requirements."[175]

(iv) Limitations of monitoring findings on the compliance of state monitoring systems
Federal monitoring findings on state monitoring should be regarded as low estimates of the number of states that have not complied with the state monitoring requirements. In each of the following examples, the federal monitoring reports appeared to contain enough information and analysis to support findings of noncompliance with state monitoring requirements, yet none expressed a clear-cut finding of noncompliance.
In its 1997 Alaska monitoring report, OSEP made the following determination:

"... AKDE [Alaska Department of Education] monitors for this requirement [FAPE-- related services] by reviewing current IEPs ..., and verifying that services are implemented as written on the IEP, but does not have a method to determine how decisions are made regarding provision of needed related services. OSEP also reviewed the most recent monitoring reports issued by AKDE for each of the public agencies to be visited. OSEP determined that AKDE did not make any findings with regard to the provision of related services...in any of these agencies."[176]

OSEP itself, however, had found noncompliance with this requirement in three Alaska agencies, providing the basis for a finding of noncompliance with the requirement for an effective method of identifying deficiencies. Yet OSEP did not state such a finding in its Alaska report.
In Alabama, OSEP made findings of LRE noncompliance in four agencies; the Alabama SEA had made such findings in only one of these agencies.[177]
Again, however, OSEP did not state a finding of noncompliance concerning the effectiveness of the method for identifying deficiencies.
In addition, in the FAPE section of its Maine report OSEP noted the following:

"In its 1994 monitoring report, OSEP cited MDOE [Maine Department of Education] for monitoring procedures that did not always result in the identification of deficiencies regarding the provision of related services. The specific related services addressed in this finding were psychological counseling and testing services. MDOE was required to revise its monitoring procedures, and take other action to ensure the provision of related services, including psychological services, needed by the child in order to benefit from special education. However, MDOE did not make findings regarding the availability and provision of psychological counseling in any of the monitoring reports for agencies A, B, and G, the agencies in which OSEP identified deficiencies in the 1996 monitoring visit. 
Agency A was monitored by MDOE in 1994, prior to the issuance of OSEP's monitoring report, and the subsequent revisions to the monitoring procedures. Agencies B and G were monitored in 1995 and 1996, after the revision of the monitoring documents...."[178]

Yet OSEP did not state a finding of noncompliance in the area of effectiveness of the method for identifying deficiencies in its 1997 Maine report.
Although OSEP pointed out the following in the FAPE section of its South Carolina report, again no clear-cut finding of noncompliance with state monitoring requirements was stated:

"Although SCDE's [South Carolina Department of Education's] monitoring procedures require that monitors verify through interview with teachers, related services providers, and parents that the related services specified in the student's IEP are being provided, OSEP found this process ineffective. Monitoring documents maintained by SCDE showed that interviews with teachers and related services providers, as required by SCDE's monitoring procedures, were not always conducted by SCDE monitoring staff to confirm that related services are provided based on the student's IEP."[179]

Finally, OSEP noted in its Tennessee report, concerning pre-placement evaluations, that the SEA made findings of noncompliance in two agencies, and verified corrective actions, yet "its monitoring procedures have not effectively ensured that agencies discontinue noncompliant practices."[180] But OSEP did not make a finding of failure to correct identified deficiencies in its Tennessee report.

The reader will note the similarities between these examples and the earlier examples in which OSEP made actual findings of noncompliance in state monitoring. Although OSEP later reported that it had required corrective actions in each of these instances, it is puzzling that OSEP did not also make clear findings of noncompliance in Alaska, Alabama, Maine, South Carolina, and Tennessee.

vi. Procedural Safeguards
Procedural safeguards ensure that parents are notified about and have access to due process. OSEP found that 39 states (78%) had failed to ensure compliance with the procedural safeguards requirements. Specific procedural safeguards requirements and the percentage of states in noncompliance are illustrated in the following chart:

[Chart 14: State Noncompliance with Procedural Safeguard Requirements not available.]

(a) Hearing Decisions Within Forty-Five Days
Unless a specific extension of time is granted by a hearing officer, final decisions in hearings must be reached and copies mailed to the parties no later than 45 days after the receipt of the request for the hearing.

OSEP found that 18 states (36%) had failed to ensure compliance with this requirement. Such violations can result in undue delays in students receiving appropriate services or placements. For example,

In Illinois, "OSEP reviewed the decisions and Illinois State Board of Education files for 11 randomly selected due process hearings (each of which was requested between March 1993 and January 1994), and found that the decision in each of the 11 hearings was reached more than 45 days after the hearing was requested. There was no documentation of a time line extension for seven of those hearings, and it appeared from the files for the other four hearings that some extension of time had been granted, but OSEP could not determine whether a decision had been reached and mailed to the parties within specific extensions of the time line."[181]
Sometimes violations of the 45-day requirement result in delays that can waste a significant portion of a school year for the students.
In Georgia, "OSEP found that in 12 of the 28 requests for a due process hearing, the 45- day time line was exceeded, and there were no requests for extensions recorded in the log prepared by Georgia Department of Education. The time lines in these cases exceeded the 45-day time lines in amounts ranging from seven days to four months and 27 days. The log noted that of the 16 requests for which extensions were recorded, 10 were extended for a specific period of time. The log entries for the other six extensions did not include a specific time limit, and all were resolved from 56 to 169 days beyond the 45-day time line requirement."[182]
vii. Protection in Evaluation
Re-evaluations of students with disabilities must occur within three years of prior evaluations. Initial evaluations must comply with time line standards set by state regulations.

OSEP found that 19 states (38%) had failed to ensure compliance with the protection in evaluation requirements.[183] For example,

In Texas, "OSEP interviewed administrators and agency officials responsible for coordination and conducting evaluations in agencies A, B, H, J, and K to determine whether all students with disabilities are evaluated at least every three years, or more often if warranted or requested by the child's parent or teacher. These officials acknowledged that some evaluations were delayed by three to twelve months beyond the three-year time line. They reported to OSEP that there was a waiting list of students in each of these agencies whose re-evaluations were overdue. Administrators from agencies A and H informed OSEP that at least 100 students' re-evaluations were delayed. Administrators in agency B explained to OSEP that 1,244 overdue re-evaluations exceeded the three-year time limit. An agency J administrator explained to OSEP that of the three regions in the district, the northeast region had 265 overdue re-evaluations for students with disabilities that exceeded the three-year time limit."[184]

In Rhode Island, "OSEP reviewed student files from six agencies and found that some student re-evaluations were from one month to five years overdue. Agency D provided OSEP with a list of students whose re-evaluations were overdue. OSEP reviewed data for 77 of the students on the list: 10 were two to three years overdue, 19 were one to two years overdue, and 48 were a year or less overdue. A special education administrator in agency E told OSEP that evaluations were seriously delayed. Of 251 re-evaluations, 151 were overdue, some by as much as five years."[185]

"OSEP reviewed documentation on initial evaluations and interviewed staff in agencies visited. These agencies provided documentation on initial evaluations completed during the 1993-94 and 1994-95 school years. That documentation showed delays in evaluations conducted by public agencies that ranged from 10 instructional days to as many as 390 instructional days (e.g., greater than two calendar years) in the following agencies:

     Agency B--63 of 400 evaluations were overdue;

     Agency C--166 of 377 evaluations were overdue;

     Agency E--49 of 600 evaluations were overdue;

     Agency F--161 of 806 evaluations were overdue;

     Agency G--68 of 386 evaluations were overdue.

OSEP collected documentation from agencies B, C, D, E, F, and G on re-evaluations conducted during the 1994-1995 school year. In interviews, administrators and agency personnel responsible for conducting these evaluations reported that the following delays were the result of staff shortages and the subsequent decision to give priority to initial evaluations over triennial re-evaluations.

     Agency B--180 of 579 evaluations overdue

     Agency E--68 of 386 evaluations overdue

     Agency G--340 of 380 evaluations overdue

In agencies E and G, these re-evaluations were, in some cases, more than a year overdue."[186]

"[I]n one district in New York, DoED reviewed a district report and found that of 5,743 students referred for assessments during the 1992-93 school year, 3,467 (60%) were overdue."[187]

e. Data Quality Issues Raised by the Monitoring Reports
At the start of this section, several problems with the standards used in assessing the federal monitoring findings were laid out, pointing to the need for fundamental changes in how state compliance with IDEA is monitored. Data quality will also play a pivotal role in the collection and use of data under the new monitoring system.
First, the 1997 reauthorization of IDEA placed a strong emphasis on results for students with disabilities and on performance measures as indicators of the states' success in meeting the goals of IDEA.

This priority emerged in part due to the second factor: the growing impact of the Government Performance and Results Act of 1993 (GPRA).[188] Aimed at improving the effectiveness of federal programs and public accountability, GPRA required federal agencies to prepare a five-year strategic plan and annual performance plans beginning with fiscal year 1999. Agency performance reports were also required, and the first report on FY 1999 is due in March 2000. The public accountability envisioned by GPRA extends to state or local government entities receiving federal funding. They are responsible to their respective funding agencies for GPRA compliance.

Under earlier provisions of IDEA, states had reported annually on their progress in implementing IDEA, but with significantly fewer quantitative data reporting requirements. Now states will have to report on all assessments of students with disabilities in the same detail and with the same frequency as on assessments of nondisabled students, for example. In order to meet the new reporting requirements, states will need to develop statewide goals, standards, and assessment systems for students with disabilities. States will also have to define the performance indicators and measures for determining if the performance standards are being met and have the systems in place to collect the data.

OSEP indicates that while many states have data collection and reporting systems in place, the systems vary tremendously. There is currently no requirement in IDEA for a standardized approach to data reporting, even for federal reporting purposes.

OSEP has monitored state compliance based in large part on the type and quality of compliance-related data available in each state. Only some elements of this data are prescribed by law. The limited availability of assessment and compliance data that are both adequate and appropriate affects states' ability to ensure that school districts are providing FAPE, LRE, procedural safeguards, etc. to children with disabilities.

There is a need to have the right data available for assessing compliance with state and federal program requirements, while minimizing the burden on resources in collecting, analyzing, and reporting on that data. A comprehensive reassessment of all data required to evaluate the many state and federal education programs will help accomplish this. For example, the data elements needed to measure compliance with IDEA and improved educational results for children with disabilities should be identified in consultation with all stakeholders, including the students, their parents, public agencies, and policy-makers. These IDEA data elements should be compared with the complete list of data elements required for evaluating all of the various federal and state programs to determine where existing data sources in each state can be drawn upon, redundant data eliminated, and missing data developed.

OSEP's leadership is critical to helping states build and maintain the efficient data systems they need to assess their own performance in meeting their responsibilities under IDEA. OSEP can bring together the stakeholders and facilitate the process of identifying the appropriate data elements for assessing IDEA compliance and educational results indicators. Because reliable data is vital to effective general supervision by the states, the Department of Education also should provide technical assistance to them for developing comprehensive, streamlined data systems.

f. Findings and Recommendations

Finding # III B.1A
After 25 years, all states are out of compliance with IDEA to varying degrees.
An analysis of the most recent federal monitoring report available for each state (from 1994-1998) indicated that no state had carried out its responsibilities to ensure compliance with all the requirements of Part B. While the degree of noncompliance with any given requirement (based on number and seriousness of infractions) varied among the states, many states had failed to ensure compliance with a significant number of requirements. Of the seven areas analyzed, 24 percent, or 10 states, had failed to ensure compliance in five areas; 24 percent, or 10 states, had failed to ensure compliance in six areas, and 12 percent, or six states, had failed to ensure compliance in seven areas. Four percent, or two states, had failed to ensure compliance in only one area.

Finding # III B.1B
More than half of the states have failed to ensure full compliance in the following areas: general supervision (90%, or 45 states); transition (88%, or 44 states); free appropriate public education (80%, or 40 states); procedural safeguards (78%, or 39 states); and least restrictive environment (72%, or 36 states).
Other areas in which states failed to ensure compliance are IEPs (44%, or 22 states) and protection in evaluation (38%, or 19 states).

Recommendation # III B.1A
Congress should ask the General Accounting Office to conduct a study of the extent to which SEAs and LEAs are ensuring that the requirements of IDEA in the areas of general supervision, transition, free appropriate public education, procedural safeguards, and least restrictive environment are being met. In addition, the Department of Education should conduct regular independent special education audits (fiscal and program) initiated by the DoED Office of Inspector General (OIG). The purpose of the audits would be to examine whether federal funds granted under IDEA Parts B and D (State Program Improvement Grants) have been and are being spent in compliance with IDEA requirements. These audits should be a supplement to OSEP's annual compliance monitoring visits, and the audit results should be in DoED's annual report to Congress. To the extent that the DoED OIG lacks the subject matter expertise to conduct program audits under IDEA, the OIG should contract with independent entities having such expertise when a program audit is necessary.

Recommendation # III B.1B
Congress should fund an independent consortium of nongovernment entities in every state to develop and conduct independent monitoring and to produce independent reports to the President and Congress on the status of each state's compliance with IDEA at the local level. Members of the nongovernment consortium should include, but not be limited to, the state's Parent Training and Information (PTI) center, protection and advocacy (P&A) system, and independent living (IL) centers.
While parents of children with disabilities and students and adults with disabilities participate in the federal monitoring process, they have no independent means for assessing the extent or quality of state compliance, for determining why state failure to ensure compliance persists, and for communicating these findings to the President and Congress. They need to be able to provide reliable and regular assessments of their state's compliance with IDEA, as well as a realistic picture of the toll of noncompliance on children and families in their state, to federal and state leaders, and to the public at large.

Finding # III B.2
OSEP did not have an explicit objective standard for assessing whether noncompliance with IDEA requirements found in any given state was systemic.
OSEP staff indicated that a state was found noncompliant with a given requirement only if the failure to ensure compliance was "systemic" (i.e., observed by monitors "with some frequency").[189] For example, a finding of noncompliance could have meant that, out of 10 schools monitored, anywhere from three to 10 had failed to ensure compliance with a given requirement. There was no established standard (quantitative or qualitative) by which OSEP determined that noncompliance was systemic.

Recommendation # III B.2A
The Department of Education should establish and use national compliance standards and objective measures for assessing state progress toward better performance results for children with disabilities and for achieving full compliance with IDEA.

Recommendation # III B.2B
OSEP should work with the states, students with disabilities, their parents, and other stakeholders to identify the core data elements needed to assess whether compliance standards are being met and performance results for children with disabilities are improving statewide.

Recommendation # III B.2C
OSEP should closely monitor state progress in developing reliable data collection and reporting mechanisms (qualitative and quantitative) that adequately and accurately assess both state compliance and performance results for children with disabilities. This recommendation coincides with a central goal of the 1997 IDEA reauthorization: to focus IDEA implementation more closely on objective performance standards and results measures.

Recommendation # III B.2D
For the next five years, OSEP's own compliance monitoring priority should be assessing state progress toward creating reliable and comprehensive data (quantitative and qualitative) to support effective state compliance monitoring capabilities.

Finding # III B.3
OSEP's monitoring reports did not clearly indicate which IDEA requirements were monitored, why they were monitored, and what the compliance status was.
OSEP reported placing "a strong emphasis on those requirements most closely associated with positive results for students with disabilities,"[190] and appeared to monitor a stable core of requirements in every state. It used information gathered during the pre-site process to help determine what to monitor.
Federal monitoring reports, however, did not list all of the requirements monitored, nor did they consistently specify the requirements with which the state appeared to comply, based on the sample of districts, student files, interviews, state policies and procedures, and state monitoring documents reviewed. In some cases, requirements with which the state appeared to comply were mentioned in report cover letters; in other cases they were not. Therefore, it was not always possible to determine all of the requirements monitored and the compliance status of each.

Recommendation # III B.3
All OSEP monitoring reports should consistently state what requirements were monitored, the rationale for choosing those requirements, which ones were in compliance, and which ones were out of compliance.
Such reporting would enable comparison between reports and over time. It would also make clear where states were determined definitively to be in compliance, which might offer opportunities for positive acknowledgment.

Finding # III B.4
OSEP monitoring did not include observation of students; rather, it involved collecting and reading documents and interviewing education personnel.
In the experience of OSEP staff, observing students consumed a great deal of time and often did not yield enough conclusive data to make clear-cut compliance determinations. Many parents and advocates criticized the monitoring process, however, as one that focused too much on talking with education personnel and reading documentation. Their concern was that this approach did not provide an adequate measure of the extent to which students were being appropriately served.

Recommendation # III B.4A
OSEP's monitoring process in each state should routinely include an ethnically diverse sample of children who are matched to their records and who are interviewed, along with their parents and service providers, for a determination of whether the law's requirements are being met on their behalf.
Routinely including interviews with children from ethnically diverse backgrounds, their parents, and their service providers in the monitoring process would provide a more grounded understanding of each state's compliance picture.

Recommendation # III B.4B
OSEP should review the files of more students placed in out-of-state residential facilities, and increase the number of compliance monitoring site visits to separate public and private facilities, as well as to state schools for students who are deaf or have visual impairments.

Finding # III B.5
A complete historical inventory of all monitoring reports issued for every state is not available, but since 1990 all reports issued have been maintained.
The historical monitoring data in these early reports would be crucial to understanding which areas have remained chronically out of compliance and how states have progressed in improving compliance over time. In addition, an analysis of the historical data could provide insight into the impact of corrective action plans on reducing noncompliance.

Recommendation # III B.5
OSEP should undertake efforts to construct a database containing all monitoring reports, corrective action plans, and compliance agreements ever issued by OSEP; to standardize all newly issued reports, plans, and agreements and capture them in the database; and to undertake a historical analysis of compliance for each state.
A historical picture of each state's compliance status will greatly inform OSEP's monitoring work and allow for examining trends over time. In addition, it will provide a sense of the persistence of certain problems in particular states.

Finding # III B.6
Important IDEA requirements appeared to be unmonitored or under-monitored.
The federal monitoring reports examined from all fifty states showed that compliance with one important requirement appeared not to be monitored, and compliance with another appeared to be under-monitored.
IDEA required states to have "[p]rocedures for adopting, if appropriate, promising practices, materials, and technology, proven effective through research and demonstration."[191] There was no evidence in the texts of the monitoring reports reviewed that compliance with this requirement had ever been monitored.
SEAs are required to "ensure" that public agencies "ensure" that "[u]nless the IEP of a child with a disability requires some other arrangement, the child is educated in the school that he or she would attend if nondisabled."[192] In the fifty reports reviewed, OSEP had made findings of noncompliance with this requirement in two states--North Dakota[193] and Utah.[194] Both reports were issued in 1994, the first year of reports reviewed. There was no evidence in the texts of the other monitoring reports reviewed that compliance with this requirement had been monitored.

Recommendation # III B.6
OSEP should ensure that every IDEA requirement is monitored in every state at regular intervals, even if a requirement is not a core requirement or has not been identified by the state as a problem noncompliance area.
OSEP should develop a method for ensuring that requirements often overlooked in the monitoring process are monitored at regular intervals. States' compliance with noncore requirements, or with requirements rarely identified as problem areas during the pre-site visit (e.g., implementation of promising practices), should be monitored at regular intervals in every state.

Finding # III B.7
OSEP frequently took too long to issue monitoring reports.
For reports issued between 1994 and 1998, the amount of time between the date the monitoring visit ended and the date of the final report was greater than 90 days for 45 states, greater than 180 days for 27 states, and greater than 365 days for 12 states. DoED's present policy is to issue the report approximately five to six months (150-180 days) after the on-site visit, but the Department recognizes the need to get the reports out more quickly. OSEP has requested additional staff and is working on a new strategy to reduce the lag time before the release of each monitoring report.

Recommendation # III B.7
OSEP should issue the monitoring report as soon as possible after the site visit, preferably within 60 days (two months).
OSEP is requesting resources and working on a new strategy to issue monitoring reports in a more timely fashion. An issuance date no later than two months following the end of the monitoring visit should be established.

Finding # III B.8
The Department has been making monitoring reports available through the Department of Education's web site as soon as they are issued.
The most recent reports (or their executive summaries) from 27 states have been made available on the OSEP web site. All new reports will be placed there in the future. Placing the reports on the web site will allow timely access for a broad range of stakeholders and a greater awareness of the monitoring issues in each state.

Finding # III B.9
The Department has begun implementing a new "continuous improvement" monitoring process in which the state collaborates with the Federal Government and other constituencies to assess the educational success of students with disabilities and to design and implement steps for improvement on an ongoing basis.

Recommendation # III B.9
The Department should conduct a formal assessment of the new continuous improvement monitoring process within the next three years. The assessment should incorporate broad stakeholder input, particularly from students with disabilities and their parents, on the effectiveness of the new process in improving compliance with Part B and improvements in educational results for students with disabilities.

The following section presents an analysis of findings on areas of noncompliance reported in the last three monitoring reports for six states.
