Top 10 Computer Science Universities in the USA



Academic Ranking of World Universities (ARWU), also known as the Shanghai Ranking, is an annual publication of university rankings by Shanghai Ranking Consultancy. The league table was originally compiled and issued by Shanghai Jiao Tong University in 2003, making it the first global ranking with multifarious indicators; an international advisory board was subsequently established to provide suggestions. The publication currently includes the world's overall and subject league tables, alongside the independent regional Greater China Ranking and Macedonian HEIs Ranking. ARWU is regarded as one of the three most influential and widely observed university measures, alongside the QS World University Rankings and the Times Higher Education World University Rankings. It is praised for its objective methodology but draws some criticism for focusing narrowly on raw research power, undervaluing the humanities and the quality of instruction.







Global rankings

Overall

Methodology

Reception

ARWU is praised by several media outlets and institutions for its methodology and influence. A survey on higher education published by The Economist in 2005 described ARWU as "the most widely used annual ranking of the world's research universities." In 2010, The Chronicle of Higher Education called ARWU "the best-known and most influential global ranking of universities". EU Research Headlines reported on ARWU's work on 31 December 2003: "The universities were carefully evaluated using several indicators of research performance." Chris Patten, Chancellor of the University of Oxford, said "the methodology looks fairly solid ... it looks like a pretty good stab at a fair comparison", while Ian Chubb, former Vice-Chancellor of the Australian National University, said "The SJTU rankings were reported quickly and widely around the world ... (and they) offer an important comparative view of research performance and reputation." Philip G. Altbach named ARWU's "consistency, clarity of purpose, and transparency" as significant strengths. Although ARWU originated in China, the ranking has been praised for being unbiased towards Asian institutions.

Criticism

Like other global rankings, ARWU has drawn criticism. It is criticised for "relying too much on award factors", thereby undermining the importance of the quality of instruction and of the humanities. A 2007 paper published in the journal Scientometrics found that the results of the Shanghai ranking could not be reproduced from raw data using the method described by Liu and Cheng; a 2013 paper in the same journal later showed how the results could be reproduced. In a report from April 2009, J.-C. Billaut, D. Bouyssou and Ph. Vincke analysed how ARWU works, using their insights as specialists in Multiple Criteria Decision Making (MCDM). Their main conclusions were that the criteria used are not relevant, that the aggregation methodology has a number of major problems, and that insufficient attention has been paid to the fundamental choice of criteria.

The ARWU researchers themselves, N. C. Liu and Y. Cheng, think that the quality of universities cannot be precisely measured by numbers alone and that any ranking must be controversial. They suggest that university and college rankings should be used with caution and that their methodologies must be understood clearly before reporting or using the results.

ARWU has also been criticised by the European Commission as well as some EU member states for "favour[ing] Anglo-Saxon higher education institutions". In France, for instance, the ranking triggers an annual controversy, focusing on its poor fit with the French academic system and the unreasonable weight given to research often performed decades ago. It is also criticised in France for being used as a motivation for merging universities into larger ones: the number of publications or award winners adds up mechanically when universities are grouped, independently of research (or teaching) quality.
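
As a minimal sketch of that additive effect, the following example uses invented figures (the institutions and numbers are purely hypothetical, not ARWU data) to show how a merger raises raw counts without changing per-capita output:

```python
# Hypothetical illustration of the merger criticism above: all figures are
# invented for illustration and are not real ARWU data.
univ_a = {"publications": 4000, "staff": 2000}
univ_b = {"publications": 3000, "staff": 1500}

# Merging the two institutions simply adds their raw counts.
merged = {
    "publications": univ_a["publications"] + univ_b["publications"],  # 7000
    "staff": univ_a["staff"] + univ_b["staff"],                       # 3500
}

# Size-dependent indicators (raw counts) rise mechanically after the merger ...
print(merged["publications"])                     # 7000, larger than either 4000 or 3000
# ... while per-capita output, a rough proxy for quality, is unchanged.
print(univ_a["publications"] / univ_a["staff"])   # 2.0
print(merged["publications"] / merged["staff"])   # 2.0
```

Because several indicators are size-dependent raw counts, a merged institution scores higher on them even though nothing about its research or teaching quality has changed.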

Results

Alternative

As it may take rising universities a long time to produce Nobel laureates and Fields Medalists in numbers comparable to those of older institutions, the institute created an alternative ranking that excludes these award factors, providing another way to compare academic performance. The weighting of all the other factors remains unchanged, so the grand total is 70%.
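
To make the 70% figure concrete, here is a minimal sketch of the weighting arithmetic. The indicator names and percentages are assumptions taken from ARWU's commonly published methodology (Alumni 10%, Award 20%, HiCi 20%, N&S 20%, PUB 20%, PCP 10%), not values stated in this article:

```python
# Sketch of the weighting arithmetic behind the alternative ranking.
# Indicator names and weights are assumptions based on ARWU's commonly
# published methodology, not values given in this article.
FULL_WEIGHTS = {
    "Alumni": 10,  # alumni winning Nobel Prizes / Fields Medals (award factor)
    "Award":  20,  # staff winning Nobel Prizes / Fields Medals (award factor)
    "HiCi":   20,  # highly cited researchers
    "N&S":    20,  # papers published in Nature and Science
    "PUB":    20,  # papers indexed in SCIE and SSCI
    "PCP":    10,  # per-capita academic performance
}
AWARD_FACTORS = {"Alumni", "Award"}

# Alternative ranking: drop the award factors, keep the remaining weights as-is.
alt_weights = {name: w for name, w in FULL_WEIGHTS.items() if name not in AWARD_FACTORS}
print(sum(alt_weights.values()))  # 70 -> the "grand total of 70%" mentioned above

def alternative_score(indicator_scores):
    """Weighted sum over the non-award indicators (each score on a 0-100 scale)."""
    return sum(w * indicator_scores[name] for name, w in alt_weights.items()) / 100
```

Since the remaining weights are left unchanged rather than rescaled to 100%, the maximum possible total in the alternative table is 70 rather than 100.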

Subject

There are two categories in ARWU's disciplinary rankings: broad subject fields and specific subjects. The methodology is similar to that adopted for the overall table, drawing on award factors, paper citations, and the number of highly cited scholars.





Regional rankings

To reflect the development of specific regions, two independent regional league tables with different methodologies were launched.

Greater China

Methodology

Results

Source of the article: Wikipedia


