J. Intell., Volume 3, Issue 3 (September 2015) – 3 articles, Pages 59-110

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Interaction Effects between Openness and Fluid Intelligence Predicting Scholastic Performance
by Jing Zhang and Matthias Ziegler
J. Intell. 2015, 3(3), 91-110; https://doi.org/10.3390/jintelligence3030091 - 18 Sep 2015
Cited by 26 | Viewed by 9645
Abstract
Figural reasoning as an indicator of fluid intelligence and the domains of the Five Factor Model were explored as predictors of scholastic performance. A total of 836 Chinese secondary school students (406 girls) from grades 7 to 11 participated. Figural reasoning, as measured by Raven’s Standard Progressive Matrices, predicted performance in Math, Chinese, and English, as well as a composite score. Among the personality domains, Openness had a positive effect on performance in all subjects after controlling for all other variables. For Conscientiousness, the effects were smaller and significant only for Math. Neuroticism had a negative effect on Math grades. The effects of Extraversion on all grades were very small and not significant. Most importantly, hierarchical latent regression analyses indicated that all interaction effects between Openness and figural reasoning were significant, revealing a compensatory interaction. Our results further suggest that scholastic performance relies on essentially the same traits throughout the secondary school years, with an important role for interaction effects between ability and personality. Implications, limitations, and suggestions for future research are discussed.
Article
Differences in Judgments of Creativity: How Do Academic Domain, Personality, and Self-Reported Creativity Influence Novice Judges’ Evaluations of Creative Productions?
by Mei Tan, Catalina Mourgues, Sascha Hein, John MacCormick, Baptiste Barbot and Elena Grigorenko
J. Intell. 2015, 3(3), 73-90; https://doi.org/10.3390/jintelligence3030073 - 14 Sep 2015
Cited by 18 | Viewed by 8404
Abstract
Intelligence assessment is often viewed as a narrow and ever-narrowing field, defined (as per IQ) by the measurement of finely distinguished cognitive processes. It is instructive, however, to remember that other, broader conceptions of intelligence exist and might usefully be considered for a comprehensive assessment of intellectual functioning. This article invokes a more holistic, systems theory of intelligence—the theory of successful intelligence—and examines the possibility of including in intelligence assessment a similarly holistic measure of creativity. The time and costs of production-based assessments of creativity are generally considered prohibitive. Such barriers may be mitigated by applying the consensual assessment technique with novice raters. To investigate this possibility further, we explored the question: how much do demographic factors such as age and gender, and psychological factors such as domain-specific expertise, personality, or self-perceived creativity, affect novices’ unidimensional ratings of creativity? Fifty-one novice judges from three undergraduate programs, majoring in three disparate expertise domains (i.e., visual art, psychology, and computer science), rated 40 child-generated Lego creatures for creativity. Results showed no differences in creativity ratings based on the expertise domains of the judges. However, judges’ personality and self-perception of their own everyday creativity appeared to influence the way they scored the creatures for creativity.
(This article belongs to the Special Issue Challenges in Intelligence Testing)
Article
Why Creativity Isn’t in IQ Tests, Why it Matters, and Why it Won’t Change Anytime Soon Probably
by James C. Kaufman
J. Intell. 2015, 3(3), 59-72; https://doi.org/10.3390/jintelligence3030059 - 7 Aug 2015
Cited by 49 | Viewed by 24292
Abstract
Creativity is a part of most theories of intelligence—sometimes a small part and sometimes a large part. Yet even IQ tests that assess aspects of intelligence that supposedly reflect creative abilities do not actually measure creativity. Recent work has argued that intelligence and creativity are more conceptually related than we have thought. In addition, creativity offers a potential way to counter issues of test bias from several different angles. That said, inherent difficulties in measuring creativity and inherent sluggishness in the test industry mean the odds are small that creativity will find its way into IQ tests as currently defined. However, other possibilities remain in related fields.
(This article belongs to the Special Issue Challenges in Intelligence Testing)