User talk:Sennadacosta

 Hello Sennadacosta, and welcome to Wikiversity! If you need help, feel free to visit my talk page, or contact us and ask questions. After you leave a comment on a talk page, remember to sign and date; it helps everyone follow the threads of the discussion. The signature icon in the edit window makes it simple. To get started, you may


 * Take a guided tour and learn to edit.
 * Visit a (kind of) random project.
 * Browse Wikiversity, or visit a portal corresponding to your educational level: pre-school, primary, secondary, tertiary, non-formal education.
 * Find out about research activities on Wikiversity.


 * Read an introduction for teachers and find out how to write an educational resource for Wikiversity.
 * Give feedback about your initial observations
 * Discuss Wikiversity issues or ask questions at the colloquium.
 * Chat with other Wikiversitans on #wikiversity-en.

And don't forget to explore Wikiversity with the links to your left. Be bold to contribute and to experiment with the sandbox or your userpage, and see you around Wikiversity! If you're a twitter user, please follow http://twitter.com/Wikiversity. --Mu301Bot 08:18, 7 March 2010 (UTC)

Feel free to leave a message on my talk page if you have any questions or need help. --mikeu talk 08:18, 7 March 2010 (UTC)

Relevance/RQ

 * Bibliometrics is a set of quantitative techniques used to determine the influence of a writer or a publication based on citation records, number of articles, and periodicity, and it is currently the standard of assessment for most scientific institutions in the world. However, these "network metrics" do not take into consideration factors such as bureaucratic requirements, institutional context, and individual motivations, much less the impact of research on society. Furthermore, they are inadequate for describing and understanding the core structures and processes of scientific authoring and publishing. Concerning the social responsibility of science, the present investigation asks: what kind of assessment would better represent, and could foster, quality in educational and scientific works?

Methods/Data

 * A survey (12 simple and multiple-choice questions) combined with an in-depth interview (6 questions), applied in face-to-face meetings to 46 researchers of the School of Forest Engineering (CIFLOMA) at the Federal University of Paraná, Brazil.

Results

 * The scientists submit their texts almost exclusively to traditional publications with higher impact factors. The publication of articles has a compulsory yet exclusive character and does not aim at the democratization or popularization of expert knowledge. Academic publishing is currently done almost exclusively to meet bureaucratic and administrative requirements. The testimony of professor Nivaldo Rizzi, a former dean at UFPR who dedicated five years of his career to managing the research and funding policy of Paraná State, addresses the issue directly: “Our institutions work with the idea of 'knowledge dominance', and not with the principle of information sharing, or collectivization of knowledge... it is a competitive standard among individuals and institutions. The current policy regarding research evaluation reinforces this obsession with a competitive edge; it does not try to change this vicious circle.”

Added Value

 * The results clearly showed an urgent need for a policy guideline to counter the current academic culture. Instead of new “metrics”, we should seek new social norms in order to qualify and foster scientific collaboration.

Peers' Comments
1) Thank you for the many details in your abstract. Well done. I missed information about the sampling procedure. What was your qualitative approach in data analysis? Did you follow a specific methodology? I also do not have the feeling that the results presented in the abstract were unexpected. As such, it remained unclear how the interviews add to existing knowledge. The problem is well known, and the discussion about it is ongoing in Germany. Still, I like the approach of unraveling more details about the problem with this mixed-methods design. Could you find any suggestions on how to tackle the problem? Did you identify any starting point? Instead of "start seeking", I would expect that your research is already seeking answers, and I would like to hear more about possible answers/solutions/alternatives in the talk.

2) This paper could be submitted to many scientific conferences, though I do not really see its specific value for General Online Research. I think it is relevant, and even if the sample is locally limited, the paper is a good impulse to promote the discussion on this issue.

3) Relevance/RQ: Highly relevant topic on the actual practices of science and the structures and conditions of academic publishing. It would be useful to make a clear connection to the chosen headline, indicating the relevance of the commoditization of scientific knowledge and the discussed concepts of open science, open access publishing, data sharing, international knowledge exchange, and research collaboration. Furthermore, as you mention the need to shift from mere quantitative measures to the inclusion of qualitative criteria, the concept of "governance by numbers" may provide a valuable theoretical grounding for your work. Results: Whereas the RQ asked for a better approach, the results as indicated in the abstract predominantly show the current state of practices. The question remains: why do you see a critical need to change these practices, and how can researchers put your approach into practice? Added Value: Since your results are based on interviews within a single discipline (School of Forest Engineering), it would be valuable to depict in which way, and to what extent, your findings can be transferred to, and in this sense generalized for, other scientific disciplines.

--141.20.126.222 (discuss) 15:27, 6 February 2014 (UTC)