AI, data, big tech, metrics

Antikythera mechanism, c. 50 BC


see also > AI, algorithms, intelligence


  • big tech/internet/AI
  • data/ metrics/stats social science
  • AI/surveillance/targets/performance
  • data/accounting

big tech/internet


ft.com 9/11/2021 We need to talk about techie tunnel vision  Gillian Tett

AI, data, metrics, big tech,


wired.co.uk   6/2021   It’s time to ditch Chrome – As well as collecting your data, Chrome also gives Google a huge amount of control over how the web works  by Kate  O’Flaherty

Despite a poor reputation for privacy, Google’s Chrome browser continues to dominate. The web browser has around 65 per cent market share and two billion people are regularly using it. Its closest competitor, Apple’s Safari, lags far behind with under 20 per cent market share. That’s a lot of power, even before you consider Chrome’s data collection practices.


seekingalpha.com   16/3/2021  Palantir Rising: War In The Garden Of Google   by Vincent Ventures   459 Comments   

summary/excerpts

  • In the sequel to our Wintel Wars article, we offer the ultimate deep dive into the tectonic shifts happening in the tech industry, giving unparalleled context to Karp’s cryptic musings.
  • We lay out how Google and Facebook gradually created privatized versions of secretive DARPA programs, and why Palantir and Apple are leading a push for privacy as industry tensions ignite.
  • We discuss how Palantir threatens the current paradigm with software for a decentralized alternative. Europe’s GAIA-X offers significant upside potential as Palantir seeks a new European hub in Switzerland.
  • While the last decade focused on the consumer internet, the Palantir+IBM partnership is a harbinger of how the next decade will be shaped by software platforms for the industrial internet.
  • Calls for the US government to provide R&D funding for the technology sector provide an additional catalyst. These sentiments were recently echoed by In-Q-Tel employees writing for a Council on Foreign Relations publication.
Apple Ahead

This is not the story of ambitious young entrepreneurs. This is the story of how private versions of some of the most advanced defense projects ever conceived were built by a company that has sought to influence US politics, unilaterally engaged in clandestine activities, and developed its own foreign policy… all in the pursuit of creating artificial intelligence. It is a complex matrix of competing objectives, technologies, and business models that investors will need to carefully consider in order to navigate the 2020s.

This is a power struggle that will have significant consequences. The most powerful companies in the world are fighting for control over the world’s most important asset: Data. If the computer is the greatest tool mankind has ever created, then the outcome of this corporate conflict will inevitably define the future of humanity.

At the turn of the millennium, the internet was highly decentralized, relying on networks of servers connected by ISPs. By the end of the decade, much of the internet infrastructure had consolidated amongst the “big tech” companies. Their “hyperscale” cloud datacenters dwarfed the ISPs. Google and Facebook were able to leverage this grip on the infrastructure of the internet to create mass surveillance enterprises that put the Orwellian powers of programs like Total Information Awareness and LifeLog into the hands of private corporations.


economics / data


medium.com   2020    Is data nonrivalrous?    Will Rinehart

Charles I. Jones and Christopher Tonetti “… are upfront in their goals in that the ‘paper develops a theoretical framework to study the economics of data.’ Continuing, they write, ‘the starting point for our analysis is the observation that data is nonrival. That is, at a technological level, data is infinitely usable.’”

The concept of rivalrous goods was first laid out by Vincent Ostrom and Elinor Ostrom in a book chapter titled “Public Goods and Public Choices.” Previous to this work, economists like Samuelson and Musgrave emphasized exclusion. …

However, the model falls apart if there is no scale effect associated with this transferred data, such that “each firm learns only from its own consumers.” … For Jones and Tonetti, data is understood to be a radically non-specific asset where massive data sets easily yield more output. This model primitive is important. If firms are unlikely to learn productive insights from the data of others, then “there is no scale effect associated with data” and policy regimes to expand data access through property rights would be nullified. In practice, data doesn’t transfer easily. …

This paper from Jones and Tonetti exemplifies the cutting edge of economic research in information. Like all research, however, it needs contextualization. The way data rivalry is defined and then made into a model fails to capture the complexity of the real world. As such, attentive readers should be skeptical of the authors’ policy prescriptions. Data transferability is complex, and public policy proposals need to take that complexity into account by avoiding simple models of a complex world.


data/metrics/stats social science


academia.edu/pdf  2021  Computational Thinking and Social Science Education  Seema Shukla Ojha

“Computational thinking is one of the biggest buzzwords in education nowadays. It has even been called the 5th C of 21st-century skills. The reason for its emerging popularity is that it is engaging. If given an opportunity, we all would like to play with a data set rather than listening to someone telling us about the data set. Computational thinking as a term was popularized in 2006 by Jeannette Wing and became linked with twenty-first-century skills. Wing argued that computational thinking is “everywhere” and “for everyone.” Computational thinking is said to be an approach in which one breaks down problems into distinct parts, looks for similarities, identifies the relevant information and opportunities for simplification, and creates a plan for a solution. This broad problem-solving technique includes the following four elements:
  •  Decomposition – breaking down problems into smaller sections.
  •  Pattern recognition – examining the problem for patterns, or similarities to previously solved problems.
  •  Abstraction – generalization of a problem; focus on the big picture and what’s important.
  •  Algorithms – solving problems through step-by-step instructions.

While applying the above-mentioned four elements of computational thinking to social science education, these were found to be highly technical and not relevant to social science, which has specific disciplinary needs across its multiple curricular contexts: history, geography, economics, and more (Hammond, Oltman & Manfra 2020). This led some scholars to select and adapt a list of computational thinking skills for social science purposes (Hammond, Oltman & Manfra 2020).

The above-mentioned formulation of “Data, Patterns, Rules and Questions” (DPR-Q) was created as a method for integrating computational thinking into social studies education. In simple words, we can understand that computational thinking is a set of problem-solving strategies that is intended, but not required, to take advantage of computers. Computational thinking is not that different from critical thinking. Computational thinking is simply a way to process information using higher-order or critical thinking, says Julie Oltman. Whether it is taught in coding class or social studies, the framework is the same: look at the provided information, narrow it down to the most valuable data and patterns, and identify themes…”…
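The four moves the excerpt describes can be sketched on a small social-studies-style question. The task and all data below are hypothetical examples of mine, not from the paper: the point is only how decomposition, pattern recognition, abstraction and algorithm fit together.

```python
# A minimal sketch of the four computational-thinking moves applied to a
# small history/geography task: "In which decade did a town grow fastest?"

# Decomposition: split the question into (a) per-decade change, (b) a maximum.
population = {1950: 12_000, 1960: 15_500, 1970: 24_000, 1980: 26_500}

# Pattern recognition: growth in every decade has the same shape: next - current.
decades = sorted(population)
growth = {d: population[nd] - population[d]
          for d, nd in zip(decades, decades[1:])}

# Abstraction: ignore this town's specifics; any year -> value series works.
def fastest_growth(series: dict) -> int:
    """Return the start year of the interval with the largest increase."""
    years = sorted(series)
    return max(zip(years, years[1:]),
               key=lambda pair: series[pair[1]] - series[pair[0]])[0]

# Algorithm: the steps above, run end to end.
print(fastest_growth(population))  # -> 1960 (the town added 8,500 people)
```

No computer is strictly required, which is the excerpt's point: the same decompose–generalise–solve routine works on paper in a classroom.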


lithub.com  10/3/2021  Is Data the Western World’s New Religion?  Tim Harford in Conversation with Andrew Keen


goodreads.com  1999   How to Lie with Statistics  by Darrell Huff  –  This book introduces the reader to the niceties of samples (random or stratified random), averages (mean, median or modal), errors (probable, standard or unintentional), graphs, indexes and other tools of democratic persuasion.


AI / surveillance/ targets/ performance


ft.com 10/2021 Surveillance  Darren Byler, C Shepherd


theguardian.com  2019  Shoshana Zuboff: ‘Surveillance capitalism is an assault on human autonomy’
What began as advertising is now a threat to freedom and democracy argues the author and scholar. Time to wake up – and fight for a different digital future


theguardian.com   26/10/2021     ‘Conditioning an entire society’: the rise of biometric data technology – The use of our bodies to unlock access to services raises concerns about the trade-off between convenience and privacy  …”Experts are concerned that biometric data systems are not only flawed in some cases, but are increasingly entering our lives under the radar, with limited public knowledge or understanding…”…


theguardian.com/  2/01/21  Home schooling: ‘I’m a maths lecturer – and I had to get my children to teach me’
Many parents struggle with home schooling in lockdown. But how are three experts in maths, English and science faring?    Kit Yates


ALT socio eco metrics/ ESG etc

Read or download PDF here   2014  “Towards an operational measurement of socio-ecological performance” by Sigrid Stagl, Claudia Kettner, Angela Köppl

Abstract
Questioning GDP as the dominant indicator of economic performance has become commonplace. For economists, economic policy always aims for a broader array of goals (like employment, price stability, trade balance) alongside income, with income being the priority objective. The Stiglitz-Sen-Fitoussi Commission argued for extending and adapting key variables of macroeconomic analysis. International organisations such as the EC, OECD, Eurostat and UN have proposed extended arrays of macroeconomic indicators (see ‘Beyond GDP’, ‘Compendium of wellbeing indicators’, ‘GDP and Beyond’, ‘Green Economy’, ‘Green Growth’, ‘Measuring Progress of Societies’). Despite these high-profile efforts, few wellbeing and environmental variables are in use in macroeconomic models. The reasons for the low uptake of socio-ecological indicators in macroeconomic models range from path dependencies in modelling and technical limitations to long and unworkable indicator lists, seemingly ad hoc choices of indicators, and poor data availability. In this paper we review key approaches and identify a limited list of candidate variables and – as much as possible – offer data sources.
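Before any "beyond GDP" dashboard can be aggregated, indicators on incompatible scales have to be normalised. The sketch below uses my own toy numbers and equal weights (neither is from the paper) to show the standard min-max mechanics, including flipping an indicator where lower is better.

```python
# A toy composite-indicator calculation: min-max normalise each indicator
# to [0, 1], invert "lower is better" ones, then average. All figures are
# hypothetical; weights are equal only for simplicity.

countries = {
    # gdp per capita ($), life expectancy (yrs), co2 per capita (t, lower = better)
    "A": {"gdp": 45_000, "life": 81.0, "co2": 6.0},
    "B": {"gdp": 30_000, "life": 78.0, "co2": 4.0},
    "C": {"gdp": 55_000, "life": 79.5, "co2": 12.0},
}

def normalise(values: dict, higher_is_better: bool = True) -> dict:
    """Min-max scale a {key: value} mapping onto [0, 1]."""
    lo, hi = min(values.values()), max(values.values())
    scaled = {k: (v - lo) / (hi - lo) for k, v in values.items()}
    return scaled if higher_is_better else {k: 1 - s for k, s in scaled.items()}

gdp  = normalise({k: v["gdp"]  for k, v in countries.items()})
life = normalise({k: v["life"] for k, v in countries.items()})
co2  = normalise({k: v["co2"]  for k, v in countries.items()},
                 higher_is_better=False)

# Equal weights; the weighting choice is exactly where such indices get contested.
composite = {k: (gdp[k] + life[k] + co2[k]) / 3 for k in countries}
print(composite)
```

The paper's complaints map onto each step: normalisation bounds depend on which countries and years are in the sample, and the weights are an unavoidable value judgement, which is one reason uptake in macroeconomic models has stayed low.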


  • Sustainability Indicators and Ecosystem and Land use Accounting,
  • Environmental Accounting and Reporting at Micro Level,
  • Accounting of Environmental Activities,
  • Material, Energy and Carbon Accounting,
  • Measurement of Decoupling, National Accounts’ Adjustment, Damage Valuation,
  • Population Census 2010 as a Tool for Environmental Policy

medium.com/    2016 Maslow’s Hierarchy of Needs vs. The Max Neef Model of Human Scale development by Neha Khandelwal


data / accounting


economist.com/    2020  The economic crisis will expose a decade’s worth of corporate fraud
Downturns are accounting crooks’ worst enemy

… “Non-GAAP adjustments have spread like wildfire through corporate accounts, making it harder to discern what numbers reflect a firm’s true financial position. The average number of non-GAAP measures used in filings by companies in the S&P 500 index has increased from 2.5 to 7.5 in the past 20 years, according to PwC, a consultancy. In credit agreements analysed by Zion Research Group, the definition of EBITDA ranges from 75 words to over 2,200. GAAP is far from perfect, but some of the divergence from it has clearly been designed to pull the wool over investors’ eyes. One study found that non-GAAP profits were, on average, 15% higher than GAAP profits.
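The mechanics behind those inflated non-GAAP figures are simple addition. The walk below is a hypothetical illustration of mine, not any company's accounts: each add-back is a commonly seen adjustment, and stacking them shows how an "adjusted EBITDA" can dwarf the GAAP bottom line.

```python
# Illustrative numbers only: walking from GAAP net income to a flattering
# "adjusted EBITDA" by stacking add-backs. Every line is hypothetical.

gaap_net_income = 100.0  # $m

add_backs = {
    "interest": 20.0,
    "taxes": 25.0,
    "depreciation_amortisation": 40.0,  # standard EBITDA stops here
    "stock_based_compensation": 30.0,   # common non-GAAP add-back
    "restructuring_charges": 15.0,      # "one-off" items, year after year
    "acquisition_costs": 10.0,
}

adjusted_ebitda = gaap_net_income + sum(add_backs.values())

print(f"GAAP net income: {gaap_net_income:.0f}")
print(f"Adjusted EBITDA: {adjusted_ebitda:.0f}")  # 240: 2.4x the GAAP figure
```

The first three add-backs are the conventional EBITDA definition; it is the open-ended tail of "adjustments" after them that Zion's 75-to-2,200-word spread in credit agreements is measuring.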

Playing around with earnings and revenue-recognition metrics is this generation’s equivalent of dotcoms using bots and other tricks to boost “eyeballs” 20 years ago, says Jules Kroll of K2 Intelligence, the doyen of corporate sleuths. “When an area is hot to the point of overheated, there is a growing temptation to juice the numbers.” In an ominous sign, SoftBank, a Japanese technology conglomerate which bet big on WeWork and dozens of other startups, said this week that it expects an operating loss of ¥1.4trn ($12.5bn) in its last fiscal year.

Besides exposing old schemes, the pandemic is likely to give rise to new ones. When economic survival is threatened, the line separating what is acceptable and unacceptable when booking revenues or making market disclosures can be blurred. Mr Kroll reckons that “amid such massive dislocation, some will inevitably cheat.”

Bruce Dorris, head of the Association of Certified Fraud Examiners, the world’s largest anti-fraud outfit, says the effects of covid-19 look like “a perfect storm for fraud”. It may engender everything from iffy accounting to stimulus-linked scams as thousands of firms—including bogus applicants—hustle for help. One fraud investigator points to private-equity-owned firms as potential targets. “There are lots of them, they are highly leveraged and they may not qualify for bail-outs because they have deep-pocketed sponsors,” he says. That increases the temptation to resort to unseemly practices. The ebbing tide is likely to reveal plenty of corporate nudity. That will not stop some businesses from taking up naturism. ”