Managing Tests

The Tests tab enables you to associate tests with the publication and specify various test-related attributes.

When you specify test parameters, fields marked with an asterisk (*) are mandatory. If a field value is retrieved directly from FlightPlan, that field is grayed out.

Managing tests involves the following tasks:

  • Add and Remove Tests

  • Edit Basic Test Attributes

  • Edit Test Specification Attributes

  • Edit Test Blueprint Attributes

  • Set Virtual Test Length

  • Configure Test Blueprint Constraints

  • Configure Test SOCKs

  • Edit Fixed Forms on Test

  • Edit Test Item and Stimulus Attributes

  • Manage Test Affinity Groups

  • Edit Test Proficiency Levels

When viewing a test, you can see a summary of its parameters by clicking the Summary tab, which displays the attributes specified in all the earlier tabs.

Add and Remove Tests

  1. Click the Tests tab on a publication.

  2. To add tests to your publication, click Add\Remove tests to Publication. The Add Tests to Publication window appears.

  3. From the Test Family dropdown list, select a test family. The available tests are displayed.

    While you can add tests from different test families to your publication, you can only select tests from one test family at a time.

  4. Mark the checkboxes for the tests you want to add to the publication and click Save. A pop-up appears, confirming that the tests have been successfully selected.

    If you select a multi-segmented test, each segment is listed as a separate test to enable you to specify test parameters for each segment.

  5. Click OK to return to the Add Tests to Publication window. To add more tests or to remove selected tests, continue to step 6. Otherwise, skip to step 7.

  6. Do one of the following:

    • To add tests from a different test family, repeat steps 3–5.

    • To remove a selected test, clear the checkbox for that test, click Save, then click OK on the confirmation message.

  7. To close the window, click the close button or Cancel. The publication’s Tests tab now lists the selected tests.

  8. Optional: To add more tests or remove tests, click Add\Remove tests to Publication and repeat steps 3–7 above.

  9. Optional: To sync ITS with FlightPlan and retrieve the latest test lists, click Refresh test data from FlightPlan.

  10. Optional: To copy forms from older test publications, do the following:

    1. Click Copy Data from old test. The Copy data from old test window appears, listing all the tests associated with the publication.

    2. Optional: To copy data from tests for a different year, make a selection from the dropdown list at the top and click Set Year.

    3. To copy data into an associated test, use the dropdown list beside that test to select the old test from which you wish to copy data.

      Repeat this step for all the tests for which you wish to copy old data.

    4. Click Save. A pop-up appears, asking you to verify that you wish to copy old data.

    5. Click OK to continue. A confirmation message appears, indicating that the data have been copied successfully.

    6. Click OK to return to the Tests tab of the publication page.

Edit Basic Test Attributes

  1. From the Actions column, click Edit for a test. The page for the selected test appears, open to the Attributes tab.

  2. Select or enter values for the available attribute fields as applicable.

    • From the Selection Algorithm dropdown list, select the required algorithm. For example, select fixedform for tests that have fixed questions. If questions should be selected based on students’ responses to items, select the relevant algorithm.

  3. Click Save. Additional test parameter tabs are displayed. Note that the tabs that are available may vary depending on the selected algorithm.

Edit Test Specification Attributes

  1. Click the Specifications tab to edit the test specification attributes.

    • In the available fields and dropdown lists, specify the psychometric attributes that TDS uses to configure the item selection algorithm. Note the following:

      • Some fields are present only on adaptive tests.

      • Attributes marked with an asterisk are required, but you may enter placeholder/dummy data (such as 0 or 1, as permitted) when the reporting scale is not yet available or when the adaptive algorithm is not implemented on any part of the test configuration.

      • For an overview of the attributes, refer to Table 8.

      • Clicking certain fields and dropdown lists opens pop-ups with more information.

    • The information that you enter in this tab should come from the Scoring Specifications document for the corresponding test.

  2. Click Save.

Table 8: Overview of Test Specification Attributes

Attribute

Description

Fixed Form

Adaptive

Included Stimuli

Specifies whether stimuli are included in the test.

Depends on whether the test includes stimuli

Depends on whether the test includes stimuli

Start Ability

The start ability is used only if there are no historical data available for an adaptive test. Specify the start ability in the scale score metric. The unscaled value, (ability − intercept) / slope, should fall between −6 and 6 on the theta scale.

Not applicable

Applicable

Start Information

When this field is not applicable, enter 0.2 as a placeholder/dummy value.

Not applicable

Applicable

Slope

The slope and intercept of the linear transformation of the theta score to the scale score. The slope value cannot be 0. When there is no transformation or the transformation is not known yet (for example, a field test), use a slope of 1 and an intercept of 0 (that is, the identity transformation).

Applicable

Applicable

Intercept

Applicable

Applicable

Weight

The value must unscale via (ability − intercept) / slope to a valid logit value (approximately −6 to +6).

Not applicable

Applicable

Cut Ability

Can be the cut score for proficient in reporting scale.

Not applicable

Applicable

Lambda Multiplier

This field sets the relative weight of one strand compared to others in computing the match to ability.

Not applicable

Applicable

Cset1 Size

Size of candidate pool of the items based on contribution to blueprint match.

Not applicable

Applicable

Cset1 Order

Method used to order Cset1. This is only applicable to adaptive tests and test segments. The default value is ABILITY, which will be used if nothing is selected.

Not applicable

Applicable

Cset2 Random

Size of final candidate pool of the items from which to select an item randomly.

Not applicable

Applicable

Cset2 Initial Random

Size of candidate pool of the items based on contribution to blueprint match for the first item or item set selected.

Not applicable

Applicable

TDS Test Prefix

Formerly used to customize test names, but this is now done in FlightPlan without using prefixes. The field cannot be edited and will typically read “NA”.

Already defined in FlightPlan

Already defined in FlightPlan

Stat Domain

The stat domain is used when there are multiple sets of parameters for an item. If you do not select an option from this dropdown, the item stats for a form may not appear properly.

Applicable

Applicable

Item Weight

For adaptive tests, this field specifies the blueprint scalar for items in the final selection to determine which ones may need to be filtered out.

Not applicable

Applicable

Ability Offset

This is used to make the adaptive ability match harder (>0), easier (<0) or not used (=0). Must be on a logit scale (~−6 to +6).

Not applicable

Applicable

Ability Weight

This is used to put emphasis on ability match as compared to the blueprint match. Default is 1.

Not applicable

Applicable

Rc Ability Weight

The priority weight associated with reporting category information.

Not applicable

Applicable

Precision Target Met Weight

Overall information weight when the information target has been hit.

Not applicable

Applicable

Precision Target Not Met Weight

Information weight when the precision target has not yet been hit.

Not applicable

Applicable

Termination Too Close

When set to Yes, the test terminates if the ability estimate is not sufficiently distant from the specified adaptive cut.

Not applicable

Applicable

Termination Overall Info

Can be Yes or No. Determines whether to use the overall information target as a termination criterion.

Not applicable

Applicable

Termination RC Info

Can be Yes or No. Determines whether to use the reporting category information target as a termination criterion.

Not applicable

Applicable

Termination Min Count

Can be Yes or No. Determines whether to use minimum test size as a termination condition.

Not applicable

Applicable

Termination Flags And

Can be Yes or No. Determines whether the other termination conditions are to be taken separately or conjunctively.

Not applicable

Applicable

Test Label

The label used to identify the test. This field cannot be edited.

Already defined in FlightPlan

Already defined in FlightPlan

Set Difficulty to 0 Where Missing

Set to Yes to ignore missing difficulty values; items without a difficulty value are assigned the value 0.

Not applicable

Applicable

Compute Segment Ability?

This dropdown list determines whether TDS estimates the starting ability of the next segment based on the results of this segment.

Not applicable

Applicable

Precision Target

Target information for the overall test. Required for adaptive segments when the new AA is being used.

Not applicable

Applicable

Ability Test ID

The test ID of another test that can supply a starting theta value. If this is set, then the slope and intercept values collected from ITS at the test level will be applied whether the score comes from a previous opportunity in the same year or from a previous year.

Not applicable

Applicable

Score Card Layout

This dropdown list allows you to hide all item content on the screen from the user, so that the only things they can view are an item position number and the response interaction. Select Yes to enable score card layout.

Applicable

Applicable

Initial Ability Years

This dropdown list defines how far back the adaptive algorithm will look when trying to estimate a student’s initial ability level.

  • 3: This year + 2 historical years (default).

  • 2: This year + 1 historical year.

  • 1: Only this school year.

  • 0: No historical data will be used, not even from the current year.

Not applicable

Applicable
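The Slope, Intercept, Start Ability, and Weight attributes above all rely on the same linear transformation between the theta metric and the reporting scale. The following is a minimal sketch of that unscaling check; the helper names and the example slope/intercept values are hypothetical, not part of ITS.

```python
# Sketch of the theta <-> scale score relationship described in Table 8.
# Helper names and the example slope/intercept values are illustrative only.

def unscale(scale_score: float, slope: float, intercept: float) -> float:
    """Unscale a scale score to the theta metric: (ability - intercept) / slope."""
    if slope == 0:
        raise ValueError("Slope cannot be 0.")  # Table 8: the slope value cannot be 0
    return (scale_score - intercept) / slope

def is_valid_theta(theta: float) -> bool:
    """Values such as Start Ability should unscale to roughly -6..+6 (theta scale)."""
    return -6.0 <= theta <= 6.0

# Identity transformation (slope 1, intercept 0), as recommended when no
# transformation is known (for example, a field test):
assert unscale(1.5, slope=1, intercept=0) == 1.5

# Illustrative reporting scale: a scale score of 2500 unscales to theta 0.
assert is_valid_theta(unscale(2500, slope=100, intercept=2500))
```

With the identity transformation, scale scores and theta values coincide, which is why Table 8 suggests a slope of 1 and an intercept of 0 when the transformation is not yet known.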

Edit Test Blueprint Attributes

Click the Blueprint tab to view the blueprint attributes.

Specify the parameters and click Save. Table 9 lists the available attributes, which vary depending on certain selections.

Table 9: Overview of Test Blueprint Attributes

Attribute

Description

Set of Grades A comma-separated list of the grades this test can be administered to.
Field Test Algorithm

Field test items are selected randomly from the pool of field test items until the total number of selected items (or total LOE of selected items, as described below) has reached the specified number. Three field test item selectors are available:

  • FT1 (sometimes called “Legacy”) is the legacy selector that pulls each field test item randomly in accordance with its relative frequency specified in the config (or equal frequency if nothing is specified). FT1 requires that item passages be subdivided into sets (called blocks). These blocks of items are administered as a complete group. The order of items within a block is also predetermined by the config.

    • The FT1 selector can utilize total field test item count or total FT LOE value of administered items as termination criteria. If item count is used for termination, field test items are selected on the fly as the student goes through the test. If LOE is used for termination, field test items are selected prior to the start of a test. Selected field test passages are removed from the operational pool if the passage contains a mix of field test and operational items.

  • FT2 (previously called “New”) overlays an adaptive selector onto field test item positions within the segment. Field test items can be assigned blueprint values if non-trivial field test affinity groups are specified and aligned to field test items. The FT2 selector uses an adaptive algorithm to order item groups based on their blueprint value. It takes the top designated number of item groups and, because the information value for all items is unknown, selects from the remaining set randomly with uniform probability. Just as in the operational adaptive algorithm, after blueprint checks pass, each remaining passage has an equal chance to be administered in each field test position (discrete items are each assigned a one-item passage with an equivalent name). The passage attributes MaxItems and NumResponses determine, respectively, the maximum and minimum number of items from the passage that can be administered at once.

  • FT3 works similarly to FT2, with the added feature of allowing the ITS user to change the default relative frequency of passages and discrete items. FT3 executes the blueprint satisfaction step first. It then chooses from the remaining set randomly with weighted distribution so that the relative frequency of passages and items matches the values specified in the config.

Fa Cset 1 Order When selecting FT items, this is the method used to order Cset1. This is only applicable to adaptive tests and test segments. The default value is DISTRIBUTION, which is used if nothing is selected.
Fa Cset 2 Random When selecting FT items, this is the size of the final candidate pool of items from which an item is selected randomly.
Fa Cset 1 Size Defines the number of items to include in the Field Test Cset1. If no value is entered here, the value entered for the (operational) Cset 1 Size attribute will be used.
Fa Cset 2 Initial Random Defines the size of the Field Test Cset2 (final candidate set) for the first field test item or set selected, based on the contribution the item makes to meeting all the blueprint requirements. If no value is entered here, the value from the Fa Cset1 Size attribute will be used.
Points Determine Form Length

For fixed-form tests only, specifies what determines the length of the test.

  • Yes: The length of the form will be determined by the points value specified below.

  • No: The length of the form needs to be between the Min Items and Max Items values specified below, inclusive.

Min Points The minimum sum of the points for all items on each form on the test.
Max Points The maximum sum of the points for all items on each form on the test.
Min Items For fixed-form tests, this is the number of items on the smallest form. For adaptive tests, this is the minimum number of items required to be administered on a test.
Max Items For fixed-form tests, this is the number of items on the largest form. For adaptive tests, this is the maximum number of items that may be administered on a test.
Is Maximum a Strict Constraint Defines whether the adaptive algorithm can exceed the maximum number of items in certain circumstances in order to satisfy other blueprint requirements.
FT Min Items The minimum number of field test items to be administered.
FT Max Items The maximum number of field test items to be administered.
FT Start Pos The first position on the test in which a field test item can be administered. For a test that contains only field test items, this value should be 1.
FT End Pos The last position on a test in which a field test item can be administered. For a test that contains only field test items, this value should be MaxItems.
Min LOE The minimum Level Of Effort for operational items.
Max LOE The maximum Level Of Effort for operational items.
FT Min LOE The minimum Level Of Effort for field test items.
FT Max LOE The maximum Level Of Effort for field test items.
Virtual Test Name Read-only field. The name of the virtual test this segment belongs to.
Segmented Test Position Read-only field. The number of this segment in a virtual test.
Is Initial Segment Determines whether this is the first segment of a multi-segmented test.
Is Final Segment Determines whether this is the last segment of a multi-segmented test.
Transition Functions The Transition Functions attribute in a segment state defines the rules for moving from one state to another within an adaptive test. It includes conditions based on scores or score ranges, with an optional Modulo condition to filter eligible transitions further. The syntax specifies which target state a student should be routed to based on their score upon completing a segment. For example, a rule like 5,10:TargetStateA; would route students with scores between 5 and 10 to TargetStateA. Modulo conditions allow the system to route only certain students to specific paths, achieving targeted distribution across different routes in the test.
Segment State Specifies the segment state that this segment belongs to. A segment state in an adaptive segmented test is a group of test segments treated as a single “state” within the student’s progression through the test. Each segment within a segment state must have the same values for each of these attributes: Transition Functions, Is Initial Segment, and Is Final Segment. When the student completes a segment, the system evaluates conditions (such as scores) to determine if a transition to a new segment state is needed. Segment states streamline the test structure, allowing multiple segments to be managed as a unit with common criteria for progression.
Sigma Function

The Sigma Function attribute in a segment state defines how the score is calculated for a test segment. This definition is used in determining the next segment to be given to the student. Sigma Function options include the following:

  • Local/Global Raw Score: Counts total points awarded in a single segment (Local) or across all segments visited so far (Global).

  • Local/Global FT Score: Similar to Local/Global Raw Score, but includes both operational and field test items.

  • Local/Global No Response: Counts unanswered items in the segment or across all segments.

  • Theta Score: Estimates the student's ability as a floating-point value.

  • ETS Field Test 2017: Counts the number of segments that the student has already completed. Whereas the other sigma functions count particular scores or response options, the “score” of the student when using this option is the count of the segments the student has been routed to.
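The Transition Functions rule shape documented above (a score range, a colon, and a target state, terminated by a semicolon) can be illustrated with a small routing sketch. This is a hypothetical reading of the documented `5,10:TargetStateA;` example only; the Modulo condition and any further syntax details are omitted because they are not specified here, and the real ITS/TDS parsing may differ.

```python
# Sketch of routing by segment score using rules shaped like "5,10:TargetStateA;"
# (inclusive score range -> target segment state). Hypothetical helper; Modulo
# conditions are not modeled.
from typing import Optional

def route(rules: str, score: int) -> Optional[str]:
    """Return the target state of the first rule whose range contains the score."""
    for rule in (r.strip() for r in rules.split(";")):
        if not rule:
            continue  # skip the empty piece after the trailing semicolon
        score_range, target = rule.split(":")
        low, high = (int(x) for x in score_range.split(","))
        if low <= score <= high:
            return target
    return None  # no rule matched

rules = "0,4:TargetStateA;5,10:TargetStateB;"
assert route(rules, 7) == "TargetStateB"   # the documented 5-10 style rule
assert route(rules, 2) == "TargetStateA"
assert route(rules, 99) is None
```

The sigma function selected for the segment supplies the `score` value that such rules evaluate.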

Set Virtual Test Length

The Virtual Test tab appears only for a test that is a segment in a virtual test, as configured in FlightPlan. The Configuration team estimates virtual test length in this tab so that it can be displayed in the Test Delivery System (TDS).

To complete this tab, enter the estimated number of items in the Virtual Test Length field and select Save.

Configure Test Blueprint Constraints

Click the Constraints tab to view the Blueprint Constraints page. Do the following to configure the content standards that are going to be tested.

  1. From the CSR Publication used for this test dropdown list, select the content standard publication you wish to use for the test and click Set. For more information, refer to Managing Content Standard Publications.

  2. A message appears, indicating that the content standards for the selected publication will be displayed. Click OK to close the message.

  3. Add constraints directly through the user interface or by uploading a file.

    • To add blueprint constraints directly in the Constraints tab:

      1. From the Active column, mark the checkboxes for the standards that you wish to test.

        • To view the child entities of a particular content standard, click the Expand button beside it.

        • To view content standards for all the levels, click Expand All. To collapse the hierarchy and view only the Level 1 content standards, click Collapse All.

        • To filter content standards, enter text in the Filter field. You can hide all nodes that don’t match your filter text by marking the Hide unmatched nodes checkbox. To clear filters, click x.

      2. Enter the maximum and minimum number of items to be included for the selected content standards, and the weight of those items.

      3. In the Strict Max column, mark the checkbox if the maximum number of items should be strictly enforced.

      4. Click Save Changes. A message appears, indicating that the constraints have been successfully saved.

      5. Click OK to close the message.

    • To import constraints:

      1. Click Import Constraints. A file browser window opens.

      2. Select the constraints file that you saved on your computer and click the appropriate button to upload it. A table displays the constraints that are being updated.

      3. Click Submit to import the constraints. A confirmation message appears, indicating that the constraints have been successfully imported.

      4. Click OK to close the message.

  4. Optional: To export the configured constraints to an Excel file, click Export Constraints.

Configure Test SOCKs

Click the Socks tab to view the Blueprint Socks page. To configure the SOCKs, follow steps 3 and 4 in the section Configure Test Blueprint Constraints, except with reference to SOCKs rather than blueprint constraints.

Edit Fixed Forms on Test

Click the Forms tab, if available. It lists all the fixed forms associated with the test. This tab is displayed only for tests that use the fixedform selection algorithm.

Edit Test Item and Stimulus Attributes

  1. Click the Item Pool tab, which displays a list of all the items and stimuli associated with the test along with their attributes, such as item position, item status, and item weight if applicable.

  2. If necessary, from the Select Form Key dropdown list, select a form. Form options are available only for tests that use the fixedform selection algorithm; an adaptive test, such as a linear on-the-fly test (LOFT), has only a single form.

  3. In the Edit Item Association for Form section, manage the items associated with this particular test or form as follows:

    1. Select an item or items from the list. You can select multiple items by holding down Ctrl or Shift.

    2. Optional: To view a single selected item, click View Item. A new browser tab or window opens, displaying the View\Edit Item page.

    3. Specify attributes (such as an item’s active status, the block it should be associated with, and whether it is a field test item) using the available dropdown lists and fields. These attributes apply only to the associated test and form, and do not affect the items themselves. Any attributes you do not set will remain as is. Attributes include the following:

      • Role: Allows you to label items with operational, field test, or other roles. Although the item role doesn’t appear in the config, it will be used in bookmaps.

      • IsFieldTest: If this attribute is set to No, the item will require parameters.

      • IsActiveOnForm: Infrequently used. Overrides items’ active statuses.

      • Block: Used to separate items within a passage into different groups. Blocking is used to field-test items associated with a stimulus when there are more items on the stimulus than there are field test slots available per student in a LOFT. For example, you may set blocks in order to field-test a stimulus with one group of items (a block) and field-test the same stimulus with a different group of items (another block). Assign each item a block letter, such as A, B, or C. The alphabetical order of the blocks has no effect. Items that are not manually associated with blocks default to block A.

      • IsRequired: Controls whether items are skippable. It is set by default to Yes for discrete items and No for passage items. To make an item skippable, this attribute must be set to No. If the item is a passage item, the Num Responses field for the passage must be set to 0 (refer to step 5).

      • NotForScoringOverride: Used to override the NFS attribute for the item in ITS. If the item has an NFS value of No (or if the item has a blank value, meaning it is No by default) and the override is set to Yes, the item will not be scored.

    4. Repeat these steps as needed.

  4. Below the item attribute settings, click Save. If you switch to another tab without saving, your changes will be lost.

  5. In the Edit Stimulus Association for Form section, do the following:

    1. Select a stimulus or stimuli from the list. You can select multiple stimuli by holding down Ctrl or Shift.

    2. Optional: To view a single selected stimulus, click View Stimuli. A new browser tab or window opens, displaying the View\Edit Stimulus page.

    3. Specify stimulus attributes using the available fields. These attributes apply only to the associated test and form, and do not affect the stimuli themselves. Any attributes you do not set will remain as is. Attributes include the following:

      • MaxItems: The maximum number of items a student can receive from the listed passage. This value is only applicable for operational passages (that is, passages for which IsFieldTest is set to No) on adaptive tests.

      • Num Responses: The default value for this field is -1, which means the passage is required. If passage items need to be skippable, this field must be set to 0.

    4. Repeat these steps as needed.

    5. Below the stimulus attribute settings, click Save. If you switch to another tab without saving, your changes will be lost.
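The interaction between the IsRequired item attribute and the passage-level Num Responses field described above can be summarized in a small sketch. The helper names are hypothetical; this mirrors the documented rules, not actual ITS code.

```python
# Sketch of the skippability rules for IsRequired (item) and Num Responses (passage).
# Hypothetical helper; mirrors the documented rules only.

def is_skippable(is_required: bool, is_passage_item: bool = False,
                 num_responses: int = -1) -> bool:
    """An item is skippable when IsRequired is No. A passage item additionally
    requires the passage's Num Responses to be 0 (the default -1 means required)."""
    if is_required:
        return False
    if is_passage_item:
        return num_responses == 0
    return True

assert not is_skippable(is_required=True)                         # discrete default (Yes)
assert is_skippable(is_required=False)                            # discrete, IsRequired set to No
assert is_skippable(is_required=False, is_passage_item=True, num_responses=0)
assert not is_skippable(is_required=False, is_passage_item=True)  # Num Responses left at -1
```

Note that both conditions must be set: changing IsRequired alone does not make a passage item skippable while its passage's Num Responses remains -1.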

Manage Test Affinity Groups

Within a test, an affinity group is a collection of items. Each affinity group has a limit on how many of the items a student may receive. An affinity group is similar to a SOCK in that it allows for setting constraints on categories of items that are not standards, though it can be used for standards as well.

For example, an affinity group can cap the number of items a student receives from a particular category, such as items of a given type or items aligned to a field test group.

You can add and edit affinity groups using the Affinity tab on a test.

  1. Click the Affinity tab.

  2. Use the Affinity Key dropdown list to select an existing affinity group to edit, or set it to None to create a new group. Then edit the other dropdown lists and fields provided:

    • Name: The name of the affinity group.

    • Is Field Test: A toggle to indicate whether the items in this group are field test items.

    • Min Items: The minimum number of items a student must receive from the affinity group.

    • Max Items: The maximum number of items a student can receive from the affinity group.

    • Is Maximum a Strict Constraint: A toggle to indicate whether a student may receive more than the maximum number specified if necessary to satisfy other constraints on the test. (The minimum number specified does not affect this setting.)

    • Weight: The weight assigned to the affinity group. This weight gives a level of importance to the constraint.

  3. When you have made your edits, click on Insert/Save to save the affinity group.

  4. To define the items that will belong to this affinity group, click Insert/Save Criteria to open the ITS search window described in Searching for an Item or Stimulus. When you click Save Search Criteria in this window and then click OK in the confirmation pop-up, a list of the criteria appears below the drop-downs and fields.

To delete the current affinity group, click Delete and then click OK in the confirmation pop-up.

Edit Test Proficiency Levels

Click the Proficiency Levels tab. Update the cut scores for each proficiency level as described below and click Save Changes.