
Engineering Critical Thinking

Jennifer A. Kurtz, MBA

Are we losing our capacity to trust? Not just in one another, but in ourselves and our ability to think critically? Has technology served us so well that we decry our own abilities to respond, or that we rush to purchase digital tools whose components are unfamiliar, whose assembly conditions are unknown, and whose embedded logic is unspecified? To avoid accountability, are we transferring too much responsibility to trusted technology?

Spreadsheets, for example, are helpful tools for analyzing the quantifiable past. Their limitations as predictive tools become quickly apparent when reviewing business plans whose arithmetic is unerring but whose basic assumptions and market understanding are flawed. Critical human thinking is needed.
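As a minimal sketch of that gap between correct arithmetic and flawed assumptions (the figures and growth rate below are invented for illustration, not drawn from any real business plan):

    # Hypothetical revenue projection: the compounding arithmetic is exact,
    # but the forecast rests entirely on one optimistic assumption.
    starting_revenue = 100_000      # year-one revenue (invented figure)
    assumed_growth_rate = 0.40      # 40% growth every year -- the flawed assumption

    revenue = starting_revenue
    for year in range(1, 6):
        print(f"Year {year}: ${revenue:,.0f}")
        revenue *= 1 + assumed_growth_rate   # computed perfectly, every time

    # No spreadsheet or script can say whether 40% annual growth reflects
    # the actual market; only critical human thinking can.

The numbers come out flawlessly; whether they mean anything is a question the tool cannot answer.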

Development of cars that extend "automatic" features into "autonomous" features continues, with the justification that drivers are really too busy tending to critical digital communication devices to be bothered with the tedium of driving -- or that human drivers are simply not capable of handling cars safely. Future generations of smart cars should sense the proximity of other cars, the mood of the driver, and traffic and road conditions, and calculate the necessary adjustments in real time. Engineered transportation management did not work out well for the 40 people who died in the 2011 Chinese bullet train collision; a lightning strike had derailed (as it were) a signaling mechanism. Imagine a network or signal failure along the DC Beltway during rush hour. Critical human thinking is needed.

Social media is another area in which critical thinking may be reengineered. A recent study indicates that Twitter bots can "catalyze new human-to-human connections." (HAL of 2001: A Space Odyssey would applaud.) Socially engineering a new trusted relationship or authority is also a Google search feature, one that comes gratis through the company's decision to incorporate data from its Google+ network into its search algorithms. Although this engineering might not qualify as "evil," the adjective "monopolistic" came to the minds of some competitors and the Federal Trade Commission. Pursuing the social engineering thread leads to Google's recent decision to simplify its privacy policies. These policies cover its array of more than 60 products: search tools, account management products, advertising services, communication and publishing tools, development resources, map-related products, statistical tools, operating systems, and desktop and mobile applications. Not the intended consequence of the 2011 FTC ruling on Google's privacy practices, perhaps. And the policy is automatic: you Google, you accept. Critical human thinking is needed.

So what is our role if critical thinking is engineered into our tools and we abdicate responsibility? We remain that necessary interface between product and cash. Res emptito ergo sum. [TRANS: I shop, therefore I am.] We can, of course, choose to educate ourselves out of that weak position. One attractive aspect of the online information assurance master's degree at Regis University is its focus on critical thinking. Students learn the art of interrogation: how to challenge the authenticity of information transmitted, processed, and stored. Who is it from? How did it get here? How do I know the message or its metadata (the data about the data) has not been compromised? Students practice using tools -- and common sense -- to understand how to protect information from misuse and to guide others toward better decisions about the responsible use of information and technology. Critical human thinking is needed!
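One small, hypothetical illustration of that interrogation: a keyed hash (an HMAC, here computed with Python's standard library) lets a recipient check that a message arrived unaltered. The shared key and message below are invented, and this is only a sketch of one integrity check, not the program's curriculum.

    import hashlib
    import hmac

    # Hypothetical shared secret and message; in practice these would be
    # agreed upon and exchanged through a protected channel.
    shared_key = b"example-shared-secret"
    message = b"Invoice 1042: pay $5,000 to account 77-3321"

    # Sender computes a keyed digest (HMAC-SHA256) over the message.
    sent_tag = hmac.new(shared_key, message, hashlib.sha256).hexdigest()

    # Receiver recomputes the digest over the bytes it actually received
    # (here, the same variable for demonstration) and compares in constant time.
    received_tag = hmac.new(shared_key, message, hashlib.sha256).hexdigest()
    if hmac.compare_digest(sent_tag, received_tag):
        print("Integrity verified -- but keep asking who sent it, and why.")
    else:
        print("Digest mismatch: the message may have been altered in transit.")

The tool answers only one of the questions above. Who sent the message, how it got here, and whether it should be trusted at all still require a skeptical human.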

Request more information or call 877-820-0581 to learn more about the online master’s in information assurance at Regis.

i. U.S. Tests Whether Consumers Like Car-to-Car Communications (January 24, 2012). By Susan Kuchinskas. MIT Technology Review. Retrieved from: http://www.technologyreview.com/business/39520/
ii. At least 32 die in east China high-speed train crash (July 23, 2011). By Ben Blanchard. Reuters. Retrieved from: http://www.reuters.com/article/2011/07/23/us-china-train-idUSTRE76M26T20...
iii. Twitter Bots Create Surprising New Social Connections (January 23, 2012). By Mike Orcutt. MIT Technology Review. Retrieved from: http://www.technologyreview.com/web/39497/?mod=chfeatured
iv. Google facing expanded antitrust probe over social search service (January 13, 2012). By Cecilia Kang. The Washington Post. Retrieved from: http://www.washingtonpost.com/business/technology/google-facing-expanded...
v. FTC Charges Deceptive Privacy Practices in Google's Rollout of Its Buzz Social Network (March 30, 2011). FTC press release. Retrieved from: http://www.ftc.gov/opa/2011/03/google.shtm
vi. Google announces privacy changes across products; users can't opt out (January 24, 2012). By Cecilia Kang. The Washington Post. Retrieved from: http://www.washingtonpost.com/business/technology/google-tracks-consumers-across-products-users-cant-opt-out/2012/01/24/gIQArgJHOQ_story.html?wpisrc=al_comboNE_b