INSTRUCTIONS TO AUTHORS
Conditions for Publication
1. Articles must be original and unpublished. Authors must expressly state that the article is not simultaneously under review at another publication. If the data or information has been previously published, the authors must explain in the cover letter accompanying the manuscript the differences between the already published article and the one submitted for review.
Following the COPE Best Practice Guidelines for Journal Editors (http://publicationethics.org/resources/guidelines), Psykhe does not tolerate any form of plagiarism.
2. Articles must be written in Spanish or English, with a maximum of 8,000 words (not including the abstract, keywords, authors' note, references, tables, figures, and appendixes).
3. Articles that primarily report measurement development or effective research methodologies may be submitted for consideration in the Methodological Developments section. Specific guidelines for articles addressing the validation of an instrument can be found later in this document.
4. Articles that report circumscribed results within a limited conceptual framework may be submitted for consideration as Brief Reports. Their length may not exceed 3,500 words (not including the abstract, keywords, authors' note, references, tables, figures, and appendixes).
5. Psykhe accepts submissions of manuscripts reviewing the scientific literature. Narrative reviews will be accepted, as well as systematic reviews and meta-analyses, depending on the nature of the problem addressed. In any case, reviews must make a clear contribution to the field. Reviews are encouraged to include (but need not be limited to) research conducted in the Ibero-American context. Articles that address original and regionally relevant topics will also be prioritized.
Various types of reviews will be considered for publication, including:
6. Articles must be double-spaced and written in 10-point Century Schoolbook.
7. Articles must be written according to the norms established by the American Psychological Association Publication Manual, 6th edition (2010).
8. Articles must include:
9. Articles must conform to current ethical principles concerning research with human beings and animals. The article must explicitly report the process of informed consent by participants, whether it has been approved by an ethics committee, and any other relevant aspect for ethical research. For further information, consult: http://www.apa.org/ethics/code/index.aspx.
10. Tables and figures must follow APA Publication Manual (6th edition) guidelines. Each table and figure must be placed on a separate page at the end of the manuscript. Each table and figure must be mentioned in the text and numbered consecutively in the order in which they are mentioned.
11. Articles must be submitted through the Psykhe website (www.psykhe.cl) using one of the following file formats: .odt, .doc, .docx, or .rtf.
For further information, please download the document Detailed Instructions for the Preparation of Articles.
About validation of instruments
Psykhe publishes studies evaluating the psychometric properties of different instruments in its Methodological Developments section. These publications represent a fundamental contribution to the development of psychological research, as well as to its diverse professional applications. Their relevance is even greater in our regional context, because many instruments have been developed and validated in other cultural contexts and thus require further evaluation before their use in Ibero-America.
Considering the importance of this type of work, this document presents a series of recommendations aimed at promoting articles of the highest quality, both in technical terms and in their contribution to our discipline. These recommendations are based primarily on:
American Educational Research Association, American Psychological Association, National Council on Measurement in Education & Joint Committee on Standards for Educational and Psychological Testing. (2014). Standards for Educational and Psychological Testing. Washington, DC: AERA.
This book defines validity as the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests, a definition with a series of implications for publications focused on studying the validity of standardized measurements of psychological constructs. Such publications should develop an argument based on multiple sources of evidence in order to justify these interpretations, including, for example, a conceptual analysis of the items and their relation to the theoretical constructs, the response processes of participants, the internal structure of the test, and the relations between the measure and other relevant variables.
These publications should also clarify the proposed interpretations of test scores and the specific uses that are being evaluated. In general, specific uses that have relevant consequences for people's lives, such as diagnostic tools or tests designed for personnel or student selection, require more robust evidence in order to justify the decisions that will be made based on the test scores. The development of these instruments should also be integrated with the current literature, with consideration of that literature's relevance for the proposed use and setting.
1. Evidence based on internal structure
Most of these studies use analyses based on participants' response patterns across the different items, specifically exploratory and/or confirmatory factor analysis. These are complex statistical tools, and they are frequently used incorrectly (in no small part because the default options in popular statistical packages can be problematic). The following articles address the most frequent errors in the use of these analyses and provide concrete recommendations:
Russell, D. W. (2002). In Search of Underlying Dimensions: The Use (and Abuse) of Factor Analysis in Personality and Social Psychology Bulletin. Personality and Social Psychology Bulletin, 28(12), 1629-1646.
Bandalos, D. L. & Boehm, M. R. (2009). Four Common Misconceptions in Exploratory Factor Analysis. In C. E. Lance & R. J. Vandenberg (Eds.), Statistical and Methodological Myths and Urban Legends: Doctrine, Verity and Fable in the Organizational and Social Sciences (pp. 61-87). New York: Routledge.
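One default option that the literature above flags as problematic is retaining every factor with an eigenvalue greater than one. A common alternative is Horn's parallel analysis, which compares the observed eigenvalues against those obtained from uncorrelated random data of the same dimensions. The following sketch (a hypothetical illustration using plain NumPy and simulated data, not a full factor-analytic workflow) shows the basic idea:

```python
import numpy as np

def parallel_analysis(data, n_sims=200, seed=0):
    """Suggest the number of factors to retain: count observed eigenvalues
    that exceed the mean eigenvalues from random data of the same shape."""
    rng = np.random.default_rng(seed)
    n, k = data.shape
    # Eigenvalues of the observed correlation matrix, descending
    obs_eig = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    # Average eigenvalues across simulated uncorrelated datasets
    sim_eig = np.zeros(k)
    for _ in range(n_sims):
        sim = rng.normal(size=(n, k))
        sim_eig += np.linalg.eigvalsh(np.corrcoef(sim, rowvar=False))[::-1]
    sim_eig /= n_sims
    return int((obs_eig > sim_eig).sum())

# Simulated responses: two uncorrelated latent factors, three items each
rng = np.random.default_rng(1)
f = rng.normal(size=(400, 2))
data = np.hstack([f[:, [0]] + rng.normal(scale=1, size=(400, 3)),
                  f[:, [1]] + rng.normal(scale=1, size=(400, 3))])
n_factors = parallel_analysis(data)
print(n_factors)
```

With this simulated two-factor structure, the procedure recovers two factors, whereas the eigenvalue-greater-than-one rule applied to noisy data can over- or under-extract.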
2. Evaluation of the reliability of measurements
The great majority of articles that use an average or a sum of multiple items to measure a specific psychological construct report an index of internal consistency, typically Cronbach's alpha coefficient, in order to provide evidence that the responses to these items are internally consistent. Just as with factor analysis, the interpretation and use of these coefficients are frequently problematic. The following articles address common misuses:
Schmitt, N. (1996). Uses and Abuses of Coefficient Alpha. Psychological Assessment, 8(4), 350.
Cortina, J. M. (1993). What Is Coefficient Alpha? An Examination of Theory and Applications. Journal of Applied Psychology, 78(1), 98.
It is also important to note that the concept of reliability is not limited to the internal consistency of a scale, and that there is a wide range of analyses that can evaluate the test’s ability to generate stable indicators of the measured construct.
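For concreteness, coefficient alpha can be computed directly from item and total-score variances. The sketch below (plain NumPy with simulated data; an illustration only, not a required analysis) implements the standard formula, alpha = k/(k-1) * (1 - sum of item variances / variance of the total score):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Coefficient alpha for an (n_respondents x k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the scale total
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Simulated responses: 500 people, 5 items driven by a single latent trait
rng = np.random.default_rng(0)
trait = rng.normal(size=(500, 1))
items = trait + rng.normal(scale=1.0, size=(500, 5))
alpha = cronbach_alpha(items)
print(round(alpha, 2))
```

Note that a high alpha for this simulated unidimensional scale says nothing by itself about dimensionality or stability over time, which is precisely the caution raised in the articles above.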
3. Including other analyses
The evaluation of the validity and reliability of a standardized measurement is not limited to factor analysis and reliability analysis. As mentioned at the beginning of this document, building an argument to justify the interpretation of test scores also requires other sources of evidence. For example, the translation of an existing scale should report the translation process in some detail and possibly include qualitative studies of the response process and the way participants understood the items (e.g., through cognitive interviews).
The relation between test scores and other variables is also of fundamental importance, because these analyses can evaluate whether the observed relations are consistent with the relations that theory predicts if the instrument is indeed measuring the proposed construct. This process may include examining the relation with other measures of the same construct (convergent evidence), the relation with measures of different constructs, in order to show that they are distinct (discriminant or divergent evidence), and the instrument's predictive power regarding relevant outcomes (criterion or predictive evidence).
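The logic of convergent and discriminant evidence reduces to a pattern of correlations: high with measures of the same construct, low with measures of different constructs. The following sketch (simulated data and hypothetical measure names, for illustration only) shows that pattern:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
trait = rng.normal(size=n)   # the construct the new scale targets
other = rng.normal(size=n)   # an unrelated construct

scale_score = trait + rng.normal(scale=0.5, size=n)  # the new instrument
convergent  = trait + rng.normal(scale=0.7, size=n)  # established measure, same construct
divergent   = other + rng.normal(scale=0.7, size=n)  # measure of a different construct

r_conv = np.corrcoef(scale_score, convergent)[0, 1]
r_disc = np.corrcoef(scale_score, divergent)[0, 1]
print(f"convergent r = {r_conv:.2f}, discriminant r = {r_disc:.2f}")
```

In this simulation the convergent correlation is substantial while the discriminant correlation hovers near zero; in a real validation study, the expected sizes of these correlations should be stated in advance on theoretical grounds.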
The analysis of the reliability and precision of an instrument becomes increasingly important as its use has more relevant consequences for people and society in general. Establishing cut points for diagnostic or selection purposes is a complex process involving methodologies and analyses not covered in this document, such as standard setting, evaluating the consistency of classifications, and generating test norms for a specific population. In general, the evidence required for a research article focusing on the validity and reliability of a test depends on the proposed use of the test, as well as on the current state of the literature.
Other recommended reading:
Strauss, M. E. & Smith, G. T. (2009). Construct Validity: Advances in Theory and Methodology. Annual Review of Clinical Psychology, 5, 1-25.
Submission Preparation Checklist
As part of the submission process, authors are required to check off their submission's compliance with all of the following items; submissions that do not adhere to these guidelines may be returned to authors.
Once an article is accepted, authors will sign a form for transferring the rights of the article to Psykhe.
The authors retain the copyright of their work but transfer their royalties to Psykhe.
The names and email addresses entered in this journal site will be used exclusively for the stated purposes of this journal and will not be made available for any other purpose or to any other party.
Pontificia Universidad Católica de Chile
Facultad de Ciencias Sociales
Escuela de Psicología
Av. Vicuña Mackenna #4860
Santiago - Chile
Tel.: (56-2) 2354 4844
Fax: (56-2) 2354 4844