Topics: The increasing accessibility of computational tools and methods, systems, networks, data-driven decision making, the key player problem, neotopology, and pushback from specialists.
In machine learning and natural language processing, a topic model is a type of statistical model for discovering the abstract "topics" that occur in a collection of documents. Intuitively, given that a document is about a particular topic, one would expect particular words to appear in the document more or less frequently: "dog" and "bone" will appear more often in documents about dogs, "cat" and "meow" will appear in documents about cats, and "the" and "is" will appear equally in both. A document typically concerns multiple topics in different proportions; thus, in a document that is 10% about cats and 90% about dogs, there would probably be about 9 times more dog words than cat words. A topic model captures this intuition in a mathematical framework, which allows examining a set of documents and discovering, based on the statistics of the words in each, what the topics might be and what each document's balance of topics is.

Network theory - Wikipedia, the free encyclopedia
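The topic-mixture intuition described above can be sketched in a few lines of Python. The topic-word probabilities below are invented purely for illustration; a real topic model would learn them from a corpus.

```python
# Toy sketch of the topic-mixture intuition: expected word counts in a
# document that is 10% "cats" and 90% "dogs". The topic-word
# probabilities here are invented, not learned from data.
topics = {
    "cats": {"cat": 0.5, "meow": 0.4, "the": 0.1},
    "dogs": {"dog": 0.5, "bone": 0.4, "the": 0.1},
}
mixture = {"cats": 0.1, "dogs": 0.9}
doc_length = 1000  # total words in the hypothetical document

expected = {}
for topic, weight in mixture.items():
    for word, prob in topics[topic].items():
        expected[word] = expected.get(word, 0.0) + weight * prob * doc_length

print(expected)
# Dog words outnumber cat words by the 9:1 mixture ratio:
print(expected["dog"] / expected["cat"])  # 9.0
```

Note that "the" gets probability mass from both topics, matching the observation that function words appear equally in documents about either subject.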
Network theory is an area of computer science and network science, and a part of graph theory. It has applications in many disciplines, including statistical physics, particle physics, computer science, electrical engineering, biology, economics, operations research, and sociology. Network theory concerns itself with the study of graphs as a representation of either symmetric relations or, more generally, of asymmetric relations between discrete objects. Applications of network theory include logistical networks, the World Wide Web, the Internet, gene regulatory networks, metabolic networks, social networks, epistemological networks, etc. See the list of network theory topics for more examples.

Spatial analysis - Wikipedia, the free encyclopedia
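The distinction above between symmetric and asymmetric relations corresponds to undirected versus directed graphs, which can be sketched with plain Python dictionaries (no graph library; names are invented for illustration):

```python
# Representing relations between discrete objects as graphs.
# A symmetric relation (e.g. "collaborates with") becomes an undirected
# graph: each edge is stored in both directions. An asymmetric relation
# (e.g. "links to") becomes a directed graph: one direction only.
from collections import defaultdict

def undirected(edges):
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)  # symmetry: the relation holds both ways
    return adj

def directed(edges):
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)  # asymmetry: only the stated direction
    return adj

collab = undirected([("Ada", "Boole"), ("Boole", "Cantor")])
links = directed([("pageA", "pageB"), ("pageB", "pageC")])

print(sorted(collab["Boole"]))  # ['Ada', 'Cantor']
print(sorted(links["pageB"]))   # ['pageC']
```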
Spatial analysis or spatial statistics includes any of the formal techniques which study entities using their topological, geometric, or geographic properties. Spatial analysis includes a variety of techniques, many still in their early development, using different analytic approaches and applied in fields ranging from astronomy, with its studies of the placement of galaxies in the cosmos, to chip fabrication engineering, with its use of 'place and route' algorithms to build complex wiring structures. In a more restricted sense, spatial analysis comprises the techniques applied to structures at the human scale, most notably in the analysis of geographic data.

Web scraping - Wikipedia, the free encyclopedia
Web scraping (web harvesting or web data extraction) is a computer software technique of extracting information from websites.

Voyant Tools: Reveal Your Texts
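As a minimal sketch of the web-scraping technique described above, the following uses only Python's standard library to extract hyperlinks from a page (real projects often fetch pages over HTTP and use a parser such as BeautifulSoup; the HTML here is an inline string for illustration):

```python
# Minimal web-scraping sketch: parse HTML and collect hyperlinks.
# In practice the page would be fetched over HTTP (e.g. urllib.request).
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []          # (href, text) pairs
        self._current_href = None
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append(
                (self._current_href, "".join(self._text_parts).strip())
            )
            self._current_href = None

html = ('<p>See <a href="https://voyant-tools.org">Voyant Tools</a> '
        'and <a href="https://d3js.org">D3.js</a>.</p>')
parser = LinkExtractor()
parser.feed(html)
print(parser.links)
# [('https://voyant-tools.org', 'Voyant Tools'), ('https://d3js.org', 'D3.js')]
```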
Voyant Tools is a web-based reading and analysis environment for digital texts.

D3.js - Data-Driven Documents
D3.js is a JavaScript library for manipulating documents based on data. D3 helps you bring data to life using HTML, SVG and CSS. D3’s emphasis on web standards gives you the full capabilities of modern browsers without tying yourself to a proprietary framework, combining powerful visualization components and a data-driven approach to DOM manipulation.

Information visualization - Wikipedia, the free encyclopedia
Information visualization or information visualisation is the study of (interactive) visual representations of abstract data to reinforce human cognition. The abstract data include both numerical and non-numerical data, such as text and geographic information. However, information visualization differs from scientific visualization: "it’s infovis [information visualization] when the spatial representation is chosen, and it’s scivis [scientific visualization] when the spatial representation is given".

Data-Driven Decision Making: Promises and Limits - The CIO Report - WSJ
Decision making has long been a subject of study, and given the explosive growth of Big Data over the past decade, it’s not surprising that data-driven decision making is one of the most promising applications in the emerging discipline of data science.

Natural language processing
Natural language processing (NLP) is a field of computer science, artificial intelligence, and linguistics concerned with the interactions between computers and human (natural) languages. As such, NLP is related to the area of human–computer interaction. Many challenges in NLP involve natural language understanding, that is, enabling computers to derive meaning from human or natural language input, and others involve natural language generation.

The objects of discourse analysis (discourse, writing, conversation, communicative event) are variously defined in terms of coherent sequences of sentences, propositions, speech, or turns-at-talk. Contrary to much of traditional linguistics, discourse analysts not only study language use 'beyond the sentence boundary', but also prefer to analyze 'naturally occurring' language use, and not invented examples. Text linguistics is related. The essential difference between the two is that discourse analysis aims at revealing socio-psychological characteristics of a person or persons rather than text structure.

System - Wikipedia, the free encyclopedia
A system is a set of interacting or interdependent components forming an integrated whole, or a set of elements (often called 'components') and relationships which are different from relationships of the set or its elements to other elements or sets.

The Key Player Problem by Steve Borgatti
The key player problem (KPP) consists of two separate sub-problems, which can be described at a general level as follows:

1. (KPP-1) Given a social network, find a set of k nodes (called a kp-set of order k) which, if removed, would maximally disrupt communication among the remaining nodes.
2. (KPP-2) Given a social network, find a kp-set of order k that is maximally connected to all other nodes.

ORBIS: The Stanford Geospatial Network Model of the Roman World
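The KPP-1 sub-problem described above can be sketched by brute force for k = 1. The network below is a toy example invented for illustration, and "disruption" is measured here simply by the size of the largest remaining connected component; Borgatti's own work uses more refined fragmentation measures.

```python
# KPP-1 sketch for k = 1: find the single node whose removal most
# fragments the network, measured by the size of the largest remaining
# connected component (smaller = more disruption). Toy data only.
from collections import deque

network = {
    "A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"},
    "D": {"C", "E"},  # D is a bridge between the two clusters
    "E": {"D", "F", "G"}, "F": {"E", "G"}, "G": {"E", "F"},
}

def largest_component(adj, removed):
    """Size of the largest connected component after removing `removed`."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:  # breadth-first search of one component
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best

key_player = min(network, key=lambda n: largest_component(network, {n}))
print(key_player, largest_component(network, {key_player}))  # D 3
```

Removing the bridge node D splits the toy network into two components of three nodes each, which no other single removal achieves.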
ORBIS: The Stanford Geospatial Network Model of the Roman World reconstructs the time cost and financial expense associated with a wide range of different types of travel in antiquity. The model is based on a simplified version of the giant network of cities, roads, rivers and sea lanes that framed movement across the Roman Empire. It broadly reflects conditions around 200 CE but also covers a few sites and roads created in late antiquity.

Social justice - Wikipedia, the free encyclopedia
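The kind of computation behind a geospatial network model like the one described above is cheapest-route search over a weighted graph. The place names and day-costs below are invented for illustration and are not values from ORBIS:

```python
# Cheapest route over a weighted graph of places and connections,
# using Dijkstra's algorithm. Edge weights are hypothetical travel
# costs in days, NOT data from the ORBIS model.
import heapq

edges = {
    "Roma":       [("Ostia", 1), ("Capua", 3)],
    "Ostia":      [("Roma", 1), ("Carthago", 4)],
    "Capua":      [("Roma", 3), ("Brundisium", 5)],
    "Carthago":   [("Ostia", 4)],
    "Brundisium": [("Capua", 5)],
}

def cheapest(start, goal):
    """Minimum total cost from start to goal (None if unreachable)."""
    frontier = [(0, start)]
    best = {start: 0}
    while frontier:
        cost, place = heapq.heappop(frontier)
        if place == goal:
            return cost
        if cost > best.get(place, float("inf")):
            continue  # stale queue entry
        for nxt, step in edges[place]:
            new_cost = cost + step
            if new_cost < best.get(nxt, float("inf")):
                best[nxt] = new_cost
                heapq.heappush(frontier, (new_cost, nxt))
    return None

print(cheapest("Roma", "Carthago"))  # 5 (via Ostia)
```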
Social justice is the ability people have to realize their potential in the society where they live.

Gephi is an interactive visualization and exploration platform for all kinds of networks and complex systems, dynamic and hierarchical graphs.

The Programming Historian
In this lesson you will first learn what topic modeling is and why you might want to employ it in your research. You will then learn how to install and work with the MALLET natural language processing toolkit to do so. Working with MALLET involves modifying an environment variable (essentially, setting up a shortcut so that your computer always knows where to find the MALLET program) and working with the command line (i.e., typing in commands manually, rather than clicking on icons or menus).

i am – the scottbot irregular
I am a historian of science, a data enthusiast, and a juggler, and the scottbot irregular is my central webspace.

Nature is unevenly distributed in and across cities despite the fact that much else about cities scales with population. The City Nature project combines spatial analysis of parks and other natural areas in cities with text mining of planning documents and published historical narratives to explore why.

Algorithms are a fascinating use case for visualization. To visualize an algorithm, we don’t merely fit data to a chart; there is no primary dataset. Instead there are logical rules that describe behavior. This may be why algorithm visualizations are so unusual, as designers experiment with novel forms to better communicate. This is reason enough to study them. But algorithms are also a reminder that visualization is more than a tool for finding patterns in data. Visualization leverages the human visual system to augment human intellect: we can use it to better understand these important abstract processes, and perhaps other things, too.

Neotopology | Digital Humanities at Stanford
I titled my talk neotopology because I believe that this network turn resembles the spatial turn in more ways than one. One of those ways is the presence of a vibrant community of practice growing outside the traditional domains in which network analysis/visualization, in the one case, and spatial analysis/visualization, in the other, have traditionally occurred. Neogeography, in short, refers to the use of cartographic and spatial analytic tools and techniques by designers, humanists, software developers and others outside of the traditional domains of GIS, GIScience and cartography. It has, at times, caused some consternation and led to fierce claims, both in digital humanities and in the tech industry, that the geographic information systems being developed by such interlopers are "Not GIS". It has also elicited, from time to time, angry responses from these traditional domains directed at flaneurs running amok with ArcGIS and the Google Maps API, which has in turn spurred defensiveness.

Liberation theology - Wikipedia, the free encyclopedia
Liberation theology is a political movement in Roman Catholic theology which interprets the teachings of Jesus Christ in relation to a liberation from unjust economic, political, or social conditions. It has been described as "an interpretation of Christian faith through the poor's suffering, their struggle and hope, and a critique of society and the Catholic faith and Christianity through the eyes of the poor". Detractors have called it Christianized Marxism.