How Can Customized IT System Support Qualitative Methods in Website Validation: Application for Visual Content Analysis
Josipa Selthofer, jselthofer@ffos.hr
Tomislav Jakopec, tjakopec@ffos.hr
Faculty of Humanities and Social Sciences, University of J. J. Strossmayer, Osijek
Introduction
• Content analysis is a highly flexible research method that has been widely used in library and information science (LIS) studies with various research goals and objectives (White & Marsh 2006)
• Visual content analysis is the most common qualitative method used in visual communication and mass media research. It is an empirical (observational) and objective procedure for quantifying recorded audio-visual (including verbal) representation using reliable, explicitly defined categories (values and independent variables) (Bell 2001; Bauer 2000)
• As media of communication, websites and web pages are a natural base for content analysis (Weare & Lin 2000), which was one of the first methodologies used in web analysis (Bates & Lu 1997) and has been employed increasingly since, although not always in the traditional way (McMillan 2000)
• The data gathering phase of a qualitative research method applied to websites in visual communication studies is extremely complex and time consuming
• At the same time, the researcher should have visual access to the web page being reviewed and a way to quantify data for the given attributes
• In this specific visual research, the most important requirement was an application organized so that the researcher has full visual control of the web page under observation and can mark and save observations directly on screen
• Since the specific visual communication research project consisted of analyzing and validating visual elements on a large number of web pages (1,017), it was difficult to conduct the research manually
• The aim of this paper is to present a customized system providing IT support in the process of quantitative data gathering
• The main research questions are: How can a customized IT support system enhance data integrity and reduce total research time, especially in the data gathering phase? Why is none of the existing IT tools available on the market suitable for visual content analysis of web pages?
Available IT tools on the market
• A review of the web and the literature on this subject shows that visual content analysis IT tools exist in two forms
• First, as tools for visual representation of gathered data, intended to ease data interpretation (Machlis 2011)
• Second, as sets of tools for gathering data while performing visual content analysis in the data gathering phase, for example: The Qualitative Data Analysis Program (QDAP), ATLAS.ti and the f4analyse software
The main characteristics of the tools above are:
• if they are free or open source software, their performance is limited
• if they are commercial software, they are expensive
Customized IT system requirements
• Since the specific research project the web application was built for was the analysis of visual graphic elements of faculty and university web pages across Europe, the web application had to provide these parts:
• a list of the faculties' URL addresses, sorted by affiliation to their university in a particular country
• a list of attributes for visual content analysis of web pages, sorted by categories and allowing validation of visual graphic properties by clicking
• a screen where the particular web page being analyzed can be seen immediately
• the ability to save, change and export all the data obtained in the research easily
• the ability to change all the attributes in the web application at any moment and therefore adjust the research, if necessary
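The authors do not publish the application's data model, but the requirements listed above map naturally onto a small relational schema. The sketch below is only an illustration under that assumption; the file name and all table and column names (country, university, website, category, attribute, validation) are hypothetical, not taken from the paper.

<?php
// setup_schema.php – hypothetical name; a minimal sketch of a relational data model
// that would satisfy the requirements above. Table and column names are assumptions,
// not details published by the authors.
$pdo = new PDO('mysql:host=localhost;dbname=validation;charset=utf8', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Countries, their universities, and each university's website to be validated.
$pdo->exec('CREATE TABLE IF NOT EXISTS country (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL)');
$pdo->exec('CREATE TABLE IF NOT EXISTS university (
    id INT AUTO_INCREMENT PRIMARY KEY,
    country_id INT NOT NULL,
    name VARCHAR(200) NOT NULL,
    FOREIGN KEY (country_id) REFERENCES country(id))');
$pdo->exec('CREATE TABLE IF NOT EXISTS website (
    id INT AUTO_INCREMENT PRIMARY KEY,
    university_id INT NOT NULL,
    url VARCHAR(255) NOT NULL,
    FOREIGN KEY (university_id) REFERENCES university(id))');

// Visual attributes to validate, grouped by category.
$pdo->exec('CREATE TABLE IF NOT EXISTS category (
    id INT AUTO_INCREMENT PRIMARY KEY,
    name VARCHAR(100) NOT NULL)');
$pdo->exec('CREATE TABLE IF NOT EXISTS attribute (
    id INT AUTO_INCREMENT PRIMARY KEY,
    category_id INT NOT NULL,
    name VARCHAR(200) NOT NULL,
    FOREIGN KEY (category_id) REFERENCES category(id))');

// One stored answer (chosen option or free-text remark) per website and attribute.
$pdo->exec('CREATE TABLE IF NOT EXISTS validation (
    website_id INT NOT NULL,
    attribute_id INT NOT NULL,
    value VARCHAR(255),
    PRIMARY KEY (website_id, attribute_id),
    FOREIGN KEY (website_id) REFERENCES website(id),
    FOREIGN KEY (attribute_id) REFERENCES attribute(id))');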
Customized Application
• The web application for the specific research project was built using an agile software development method on the LAMP stack and is available online. It offers three main sections: a list of websites to evaluate, a visual representation of the loaded website, and a list of attributes grouped by categories for quantifying data. The proposed customized IT tool allows data export to the widely accepted MS Excel format for further data analysis
• The IT support was built using open source technologies: the Linux Ubuntu distribution as the operating system, Apache as the web server, MySQL as the relational database management system, PHP as the programming language, and HTML, CSS and JavaScript (the jQuery framework) as the client-side technology stack
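As an illustration of how this stack could record each answer the moment it is clicked (the immediate AJAX saving described on the next slide), here is a minimal server-side sketch. The endpoint name save_validation.php and the table names are assumptions carried over from the schema sketch above, not details from the paper.

<?php
// save_validation.php – hypothetical endpoint name; a sketch of how the PHP/MySQL layer
// could store one marked attribute as soon as the researcher clicks it, so that no
// separate "save" button is needed. Assumes the hypothetical schema sketched above.
$pdo = new PDO('mysql:host=localhost;dbname=validation;charset=utf8', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$websiteId   = (int) ($_POST['website_id'] ?? 0);
$attributeId = (int) ($_POST['attribute_id'] ?? 0);
$value       = trim($_POST['value'] ?? '');   // chosen option or free-text remark

if ($websiteId === 0 || $attributeId === 0) {
    http_response_code(400);
    exit('Missing website or attribute id');
}

// Insert the answer, or overwrite it if the researcher changes it later.
$stmt = $pdo->prepare(
    'INSERT INTO validation (website_id, attribute_id, value)
     VALUES (:w, :a, :v)
     ON DUPLICATE KEY UPDATE value = :v2');
$stmt->execute([':w' => $websiteId, ':a' => $attributeId, ':v' => $value, ':v2' => $value]);

header('Content-Type: application/json');
echo json_encode(['saved' => true]);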
• The web application is deployed at http://oziz.ffos.hr/epub/JosipaDoktorat/
• The user interface is in Croatian
• After a successful login, the user gets a menu of items that allows them to view, insert, change or delete all entities – these actions enable a researcher to administer the data being validated
• After defining the data for validation, the researcher marks each specific property of a specific page on the Validation page
• On the left side of the page is a list of countries, with a sub-list of the universities in each country and, most importantly, a sub-list of the websites of each university
• Each website is a link
• When the researcher clicks on that link, the page is loaded into the central part of the screen (using AJAX), and on the right side of the screen a property list appears, organized by the defined categories. The researcher can now analyze the web page shown in the central part of the screen and mark each attribute by clicking on the given option of the particular property or by writing remarks. Using AJAX, the application stores answers in the database immediately, so the researcher does not have to click an additional save button
• After validation, all data gathered in the research are available for export. The export is a comma-separated values (CSV) file that can easily be edited in the popular office tool MS Excel or imported into statistical software (such as SPSS)
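The export step could look roughly as follows; again this is only a sketch under the same assumptions (export.php and the table names are hypothetical), using PHP's built-in fputcsv() so the resulting file opens directly in MS Excel or can be imported into SPSS.

<?php
// export.php – hypothetical name; a sketch of the CSV export step described above,
// using the hypothetical schema from the earlier sketches. One line is written per
// stored answer, joined back to its website, university, country, attribute and category.
$pdo = new PDO('mysql:host=localhost;dbname=validation;charset=utf8', 'user', 'secret');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="validation_export.csv"');

$out = fopen('php://output', 'w');
fputcsv($out, ['country', 'university', 'website', 'category', 'attribute', 'value']);

$stmt = $pdo->query(
    'SELECT co.name, un.name, ws.url, ca.name, att.name, va.value
     FROM validation va
     JOIN website ws    ON ws.id = va.website_id
     JOIN university un ON un.id = ws.university_id
     JOIN country co    ON co.id = un.country_id
     JOIN attribute att ON att.id = va.attribute_id
     JOIN category ca   ON ca.id = att.category_id
     ORDER BY co.name, un.name, ws.url, ca.name, att.name');

while ($row = $stmt->fetch(PDO::FETCH_NUM)) {
    fputcsv($out, $row);
}
fclose($out);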
LIS students' competencies and their applicability in building customized IT tools
• Building a web application customized for visual content analysis research demands specific knowledge and competencies regarding web page design, relational database management systems and one of the server-side programming languages
• LIS school curricula contain subjects covering all of those areas, so it is reasonable to conclude that LIS students' competencies after graduation are sufficient for building such a customized web application
Findings
To evaluate the efficiency of the application, the data gathering phase of the research was carried out first manually and then through the web application, and the results were compared. For the analysis and comparison, 104 web pages (10% of the research sample) were examined. The overall time necessary for data gathering was measured for a single web page and for all web pages, together with the features of the analysis.
The data gathering phase of the specific research project consists of:
• finding and clicking on a specific URL address
• searching for visual attributes on the web page and marking them
• importing the data into MS Excel format for further analysis
The comparison conducted in the data gathering phase of the visual research shows:
• the total time spent on the data gathering phase through the web application is almost three times less than the time spent when data was gathered manually
• in the automated process, importing the data into MS Excel format is skipped because the application exports it directly
• clicking on and loading a specific URL address is faster using the web application, since all URL addresses are imported into the application before the analysis
The main advantages of the customized web application for visual content analysis of web pages also include:
• the ability to edit and change the added URL addresses, attributes, categories and gathered data
• the ability to export gathered data in MS Excel format
• the ability to present gathered data visually on the web instantly
Conclusion
• The data gathering phase of a qualitative research method applied to websites in visual communication studies is extremely complex and time consuming
• The aim of this paper is to present a customized system providing IT support in the process of quantitative data gathering
• For the specific visual content analysis research of the web pages, the web application shows better results in all aspects of the data gathering phase, since none of the existing IT tools for content analysis is suitable for visual content analysis of the visual graphic elements of web pages
• The main conclusions of the research are that the use of customized IT support in visual content analysis reduces the time necessary for data gathering and increases data credibility
• Some of the main advantages of such an application are the ability to edit and change the added URL addresses, attributes, categories and gathered data, to export gathered data in MS Excel format and to present gathered data visually on the web instantly
• Clicking on and loading a specific URL address is faster using the web application, and the possibility of errors is much smaller
• Finally, during their education LIS students gain the knowledge and competencies necessary for building a custom web application for specific research demands
REFERENCES
• Ashcroft, L. (2004). Developing competencies, critical analysis and personal transferable skills in future information professionals. Library Review, 53(2), pp. 82-88.
• Bauer, M. (2000). Classical content analysis: A review. In M. W. Bauer & G. Gaskell (Eds.), Qualitative researching with text, image, and sound: A practical handbook (pp. 131-151). London: Sage.
• Bates, M. J. & Lu, S. (1997). An exploratory profile of personal home pages: Content, design, metaphors. Online and CD-ROM Review, 21(6), pp. 331-340.
• Bell, P. (2001). Content analysis of visual images. In T. van Leeuwen & C. Jewitt (Eds.), The Handbook of Visual Analysis (pp. 15-34). London: Sage.
• Dragija-Ivanovic, M., Faletar, S., Pehar, F. & Aparac-Jelusic, T. (2003). The needs of the archives, libraries and museums community: A preliminary research report. In L. Ashcroft (Ed.), Coping with continual change – change management in SLIS. Proceedings of the European Association for Library and Information Education and Research (EUCLID) and the Association for Library and Information Science Education (ALISE) Joint Conference, Potsdam, Germany, pp. 46-58.
• Fraternali, P. (1999). Tools and approaches for developing data-intensive Web applications: A survey. ACM Computing Surveys, 31(3), pp. 227-263.
• Hanson-Baldauf, D. & Hassell, H. S. (2009). The information and communication technology competencies of students enrolled in school library media certification programs. Library & Information Science Research, 31(1), pp. 3-11.
• Machlis, S. (2011). 22 free tools for data visualization and analysis. Retrieved May 22, 2014 from http://www.computerworld.com/s/article/9215504/22_free_tools_for_data_visualization_and_analysis
• McMillan, S. J. (2000). The microscope and the moving target: The challenge of applying content analysis to the World Wide Web. Journalism and Mass Communication Quarterly, 77(1), pp. 80-98.
• Paton, B. (2011). Presenting complex data visually: Using web-based tools to make your development data travel. Retrieved May 15, 2014 from http://www.researchtoaction.org/2011/09/presenting-complex-data-visually-using-web-based-tools-to-make-your-development-data-travel/
• Weare, C. & Lin, W. Y. (2000). Content analysis of the World Wide Web: Opportunities and challenges. Social Science Computer Review, 18(3), pp. 272-292.
• White, M. D. & Marsh, E. E. (2006). Content analysis: A flexible methodology. Library Trends, 55(1), pp. 22-45.