American Sociological Association

The search found 112 results.

Search results

  1. Neoliberalism

    Johanna Bockman unpacks a hefty term, neoliberalism. She traces its roots and its uses, decoding it as shorthand for a “bootstraps” ideology that trumpets individualism and opportunity but enforces conformity and ignores structural constraints.

  2. Seeing Disorder: Neighborhood Stigma and the Social Construction of “Broken Windows”

    This article reveals the grounds on which individuals form perceptions of disorder. Integrating ideas about implicit bias and statistical discrimination with a theoretical framework on neighborhood racial stigma, our empirical test brings together personal interviews, census data, police records, and systematic social observations situated within some 500 block groups in Chicago. Observed disorder predicts perceived disorder, but racial and economic context matter more.
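
    To make the “predicts” claim concrete, a multilevel specification of roughly the following form is one plausible way to relate perceived disorder to observed disorder and neighborhood composition; the variables and structure here are illustrative, not the authors’ reported model:

      \text{PerceivedDisorder}_{ij} = \beta_0 + \beta_1\,\text{ObservedDisorder}_j + \beta_2\,\text{RacialComposition}_j + \beta_3\,\text{Poverty}_j + u_j + \varepsilon_{ij}

    where i indexes respondents, j indexes block groups, and u_j is a block-group random effect. The finding that context matters more would correspond to the composition terms dominating the observed-disorder term.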

  3. The Spatial Proximity and Connectivity Method for Measuring and Analyzing Residential Segregation

    In recent years, there has been increasing attention focused on the spatial dimensions of residential segregation—from the spatial arrangement of segregated neighborhoods to the geographic scale or relative size of segregated areas. However, the methods used to measure segregation do not incorporate features of the built environment, such as the road connectivity between locations or the physical barriers that divide groups. This paper introduces the spatial proximity and connectivity (SPC) method for measuring and analyzing segregation.
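
    For contrast, the classic aspatial index of dissimilarity below captures unevenness across areal units while ignoring road connectivity and physical barriers, which is exactly the limitation the SPC method is meant to address. This sketch is the standard baseline measure, not the SPC method; the function name and inputs are illustrative.

      import numpy as np

      def dissimilarity_index(group_a, group_b):
          """Classic (aspatial) index of dissimilarity across areal units.

          group_a, group_b: counts of each group per unit (e.g., per block group).
          Returns the share of either group that would need to relocate for an
          even distribution (0 = fully integrated, 1 = fully segregated).
          """
          a = np.asarray(group_a, dtype=float)
          b = np.asarray(group_b, dtype=float)
          return 0.5 * np.abs(a / a.sum() - b / b.sum()).sum()
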
  4. Estimating the Relationship between Time-varying Covariates and Trajectories: The Sequence Analysis Multistate Model Procedure

    The relationship between processes and time-varying covariates is of central theoretical interest in addressing many social science research questions. On the one hand, event history analysis (EHA) has been the chosen method to study these kinds of relationships when the outcomes can be meaningfully specified as simple instantaneous events or transitions.
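
    The quantity EHA typically models is the hazard rate of an instantaneous event or transition,

      h(t) = \lim_{\Delta t \to 0} \frac{\Pr\left(t \le T < t + \Delta t \mid T \ge t\right)}{\Delta t},

    where T is the time at which the event occurs. This standard definition is given only to clarify what “simple instantaneous events or transitions” means; it is not part of the authors’ procedure.
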
  5. Comment: The Inferential Information Criterion from a Bayesian Point of View

    As Michael Schultz notes in his very interesting paper (this volume, pp. 52–87), standard model selection criteria, such as the Akaike information criterion (AIC; Akaike 1974), the Bayesian information criterion (BIC; Schwarz 1978), and the minimum description length principle (MDL; Rissanen 1978), are purely empirical criteria in the sense that the score a model receives does not depend on how well the model coheres with background theory. This is unsatisfying because we would like our models to be theoretically plausible, not just empirically successful.
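
    For reference, the two most familiar of these criteria trade off fit (the maximized likelihood) against complexity (k parameters, n observations), with lower scores preferred:

      \mathrm{AIC} = 2k - 2\ln\hat{L}, \qquad \mathrm{BIC} = k\ln n - 2\ln\hat{L}.

    Neither formula contains a term for agreement with background theory, which is the sense in which such criteria are purely empirical.
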
  6. Comment: Evidence, Plausibility, and Model Selection

    In his article, Michael Schultz examines the practice of model selection in sociological research. Model selection is often carried out by means of classical hypothesis tests. A fundamental problem with this practice is that these tests do not give a measure of evidence. For example, if we test the null hypothesis β = 0 against the alternative hypothesis β ≠ 0, what is the largest p value that can be regarded as strong evidence against the null hypothesis? What is the largest p value that can be regarded as any kind of evidence against the null hypothesis?
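
    One standard way to formalize a measure of evidence, in contrast to a p value, is the Bayes factor comparing the two hypotheses:

      \mathrm{BF}_{10} = \frac{\int p(D \mid \beta)\,\pi(\beta)\,d\beta}{p(D \mid \beta = 0)},

    the ratio of how well the alternative (averaged over a prior π on β) and the null predict the data D. This is offered as general background on the evidential point, not as the article’s specific proposal.
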
  7. Comment: Bayes, Model Uncertainty, and Learning from Data

    The problem of model uncertainty is a fundamental applied challenge in quantitative sociology. The authors’ language of false positives is reminiscent of Bonferroni adjustments and the frequentist analysis of multiple independent comparisons, but the distinct problem of model uncertainty has been fully formalized from a Bayesian perspective.
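
    The Bayesian formalization alluded to is model averaging: uncertainty about which model M_k is correct is carried into inference about a coefficient β by weighting each model’s posterior by its posterior model probability,

      p(\beta \mid D) = \sum_{k} p(\beta \mid M_k, D)\, p(M_k \mid D), \qquad p(M_k \mid D) \propto p(D \mid M_k)\, p(M_k).

    This is the textbook form of Bayesian model averaging, stated here for orientation rather than as the comment’s particular estimator.
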
  8. Comment: Some Challenges When Estimating the Impact of Model Uncertainty on Coefficient Instability

    I once had a colleague who knew that inequality was related to an important dependent variable. This colleague knew many other things, but I focus on inequality as an example. It was difficult for my colleague to know just how to operationalize inequality. Should it be the percentage of income held by the top 10 percent, top 5 percent, or top 1 percent of the population? Should it be based on the ratio of median black income to median white income, or should it be the log of that ratio? Should it be based on the Gini index, or perhaps the Theil index would be better?
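
    The competing operationalizations listed here are all straightforward to compute. The sketch below, with illustrative function names and NumPy assumed as a dependency, shows top-income shares, the (log) ratio of group medians, and the Gini index:

      import numpy as np

      def top_share(incomes, top_frac=0.10):
          """Share of total income held by the top `top_frac` of earners."""
          x = np.sort(np.asarray(incomes, dtype=float))[::-1]   # descending
          k = max(1, int(round(top_frac * x.size)))
          return x[:k].sum() / x.sum()

      def median_ratio(group_a, group_b, log=False):
          """Ratio (or log of the ratio) of median income in group_a to group_b."""
          r = np.median(group_a) / np.median(group_b)
          return float(np.log(r)) if log else float(r)

      def gini(incomes):
          """Gini index: 0 = perfect equality, 1 = maximal inequality."""
          x = np.sort(np.asarray(incomes, dtype=float))          # ascending
          n = x.size
          cum = np.cumsum(x)
          # Equivalent to sum_i sum_j |x_i - x_j| / (2 * n**2 * mean(x))
          return (n + 1 - 2 * (cum / cum[-1]).sum()) / n
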
  9. Practicing Intersectionality in Sociological Research: A Critical Analysis of Inclusions, Interactions, and Institutions in the Study of Inequalities

    In this article we ask what it means for sociologists to practice intersectionality as a theoretical and methodological approach to inequality. What are the implications for choices of subject matter and style of work? We distinguish three styles of understanding intersectionality in practice: group-centered, process-centered, and system-centered. The first emphasizes placing multiply marginalized groups and their perspectives at the center of the research.

  10. What Are Dual Process Models? Implications for Cultural Analysis in Sociology

    In this paper we introduce the idea of the dual process framework (DPF), an interdisciplinary approach to the study of learning, memory, thinking, and action. Taking the successful reception of Vaisey (2009) as a point of departure, we suggest that intradisciplinary debates in sociology regarding the merits of “dual process” formulations can benefit from a better understanding of the theoretical foundations of these models in cognitive and social psychology.