Please use this identifier to cite or link to this item: https://hdl.handle.net/1822/50069

Title: Development of a word reading test: identifying students at-risk for reading problems
Author(s): Chaves-Sousa, Séli
Santos, Sandra Cristina Silva
Viana, Fernanda Leopoldina
Vale, Ana Paula
Cadime, Irene Maria Dias
Prieto, Gerardo
Ribeiro, Iolanda
Keywords: Rasch Model
Word reading
Assessment
At-risk readers
Reading problems
Issue date: 2017
Publisher: Elsevier
Journal: Learning and Individual Differences
Abstract(s): The aim of this study was twofold. In Study 1, we described the development of four forms of a test of word reading (TLP – Teste de Leitura de Palavras) for elementary school children (grades 1 to 4), using the Rasch model. An initial pool of 142 words was selected and tested on 905 Portuguese students. Rasch analyses allowed the development of a shorter version of the test for each grade, with adjusted values for reliability coefficients and item local independence. In Study 2 (n = 325), the classification accuracy of the TLP in identifying students at risk for reading problems was examined based on several indices. Results indicated that each test form of the TLP presented overall satisfactory classification accuracy in identifying at-risk readers when a criterion of 0.80 was used to set the sensitivity levels.
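The classification-accuracy analysis summarized in the abstract relies on standard indices such as sensitivity and specificity. As an illustrative sketch only, not taken from the article, the Python snippet below shows how a cut score on a word-reading test could be chosen so that sensitivity reaches at least 0.80, the criterion mentioned above; the data, score range, and variable names are hypothetical.

    # Illustrative sketch (hypothetical data): pick the lowest cut score whose
    # sensitivity for identifying at-risk readers is at least 0.80.
    import numpy as np

    rng = np.random.default_rng(0)
    scores = rng.integers(0, 31, size=200)   # hypothetical raw test scores (0-30)
    at_risk = rng.random(200) < 0.2          # hypothetical at-risk status from an external criterion

    def sensitivity_specificity(cut):
        flagged = scores <= cut              # students at or below the cut are flagged as at risk
        sens = np.mean(flagged[at_risk])     # true positives / all truly at-risk
        spec = np.mean(~flagged[~at_risk])   # true negatives / all truly not at-risk
        return sens, spec

    for cut in range(0, 31):
        sens, spec = sensitivity_specificity(cut)
        if sens >= 0.80:                     # sensitivity criterion of 0.80, as in the abstract
            print(f"cut score {cut}: sensitivity {sens:.2f}, specificity {spec:.2f}")
            break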
Type: Article
URI: https://hdl.handle.net/1822/50069
DOI: 10.1016/j.lindif.2016.11.008
ISSN: 1041-6080
Publisher version: https://ac.els-cdn.com/S1041608016302680/1-s2.0-S1041608016302680-main.pdf?_tid=ceba4bb0-fd16-11e7-95e1-00000aacb35f&acdnat=1516366186_9eb2bba17517c51b1d4da5297070be4f
Peer-Reviewed: yes
Access: Open access
Appears in Collections: CIEC - Artigos (Papers)

Files in This Item:
File: Chaves-Sousa et al Learning and Individual Differences.pdf (284,01 kB, Adobe PDF)
