
Vanishing Canada: Why we’re all losers in Ottawa’s war on data


Records deleted, burned, tossed in Dumpsters. A Maclean’s investigation on the crisis
in government data


Anne Kingston
September 18, 2015



When told that his small Prairie town had, in profound ways, fallen off the statistical map of Canada, Walter Streelasky, mayor of Melville, Sask., is incredulous. Streelasky had no idea Melville had been rendered a “statistical ghost town”
after the mandatory long-form census was cut in 2010, and fewer than 50 per cent of the one third of Melville’s 4,500
residents who got the voluntary National Household Survey that replaced it in 2011 completed the form. Melville still
exists—but as a shadow. We know how many people live there, but nothing about them—where they work, their
education levels, whether they’re married, single or divorced, how many are immigrants, how many are unemployed,
how many live in poverty. Melville’s numbers, then, aren’t factored into Canadian employment numbers or divorce
rates or poverty rates. According to
Sask Trends Monitor, the high non-response rate in the province resulted in
“no socioeconomic statistics about the populations in about one-half of Saskatchewan communities.” Nationally,
we’re missing similar data on 20 per cent of StatsCan’s 4,556 “census subdivisions,” making a fifth of Canada’s
 recognized communities statistical dead zones.

“To be dropped off the face of the Earth is pretty frightening,” says Streelasky, noting that Melville appears very
 much alive from his office: “We can smell the wildfires burning.” He plans to discuss the situation with his MP: “It’s
the obligation of the federal government to make national data collection as complete as possible.”

Towns like Melville are far from the only entities vanishing from official Canadian records. Physicist Raymond Hoff,
who published more than 50 reports on air pollution in transport and toxic chemicals in the Great Lakes—including
pioneering work on acid rain—at Environment Canada between 1975 and 1999, doesn’t seem to exist, either.
 “Nothing comes up when I type my name into the search engine on [Environment Canada’s] website,” says Hoff,
 now a professor emeritus at the University of Maryland. Also gone are internal reports on the oil sands experiments
 of the 1970s. “That research was paid for by the taxpayer. Now, the people who need to protect Canada’s
 environment can’t get access.”

Protecting Canadians’ access to data is why Sam-Chin Li, a government information librarian at the University of
Toronto, worked late into the night with colleagues in February 2013, frantically trying to archive the federal
Aboriginal Canada portal before it disappeared on Feb. 12. The decision to kill the site, which had thousands of
 links to resources for Aboriginal people, had been announced quietly weeks before; the librarians had only days to
 train with web-harvesting software.

The need for such efforts has taken on new urgency since 2014, says Li, when some 1,500 websites were
 centralized into one, with more than 60 per cent of content shed. Now that reporting has switched from print to
digital only, government information can be altered or deleted without notice, she says. (One example:
 In October 2012, the word “environment” disappeared entirely from the section of the Transport Canada
website discussing the Navigable Waters Protection Act.)
(Derek Mortensen)

Stories about government data and historical records being deleted, burned—even tossed into
Dumpsters—have become so common in recent years that many Canadians may feel inured to
them. But such accounts are only the tip of a rapidly melting iceberg. A months-long Maclean’s
investigation, which includes interviews with dozens of academics, scientists, statisticians,
economists and librarians, has found that the federal government’s “austerity” program, which
resulted in staff cuts and library closures (16 libraries since 2012), along with arbitrary changes to
data policy, has led to a systematic erosion of government records far deeper than most realize,
with the data and data-gathering capability we do have severely compromised as a result.

Statistics Canada no longer provides a clear snapshot of the country, says John Stapleton, a
Toronto-based social policy consultant. “Our survey data pixelates—it’s a big blur. And the
small data, we don’t know if it’s right.”

How many Canadians live in poverty now, compared to 2011? We don’t know; changes in
income-data collection have made it impossible to track. Austerity measures, ironically, have
resulted in an inability to keep track of the changes: StatsCan used to provide detailed,
comprehensive data on salaries and employment at all levels of government; now we can’t
tell where, or how deep, the cuts have been.

Related reading from Jonathon Gatehouse: When science goes silent

Disappearing data is only one part of a larger narrative of a degradation of knowledge—one
 that extends from federal scientists being prevented from talking about their research on topics
as mundane as snow to the Truth and Reconciliation Commission being forced to
  take the federal government to court to obtain documents that should have been available
under Access to Information. The situation has descended into farce: Library and Archives
Canada (LAC), entrusted with preserving historic papers, books, photographs, paintings, film
and artifacts, was so eroded by cuts that, a few years ago, author Jane Urquhart was unable to
access her own papers, donated to LAC in the 1990s.

The result is a crisis in what Canadians know—and are allowed to know—about themselves.
The threat this poses to a functioning democracy has been raised over the past several years,
most recently, in the massive, damning June 2015 report “Dismantling democracy: Stifling debate and dissent in Canada,” produced by Voices-Voix, a non-partisan coalition of more than 200 organizations and
5,000 individuals.



Less discussed, however, is how data erasure also threatens the economy, industry, the arts,
and the country’s ability to compete internationally. The title of the 2013 report “Information management in the Canadian federal government” is not likely to attract the non-librarian reader. But the conclusions
 drawn by its authors, a librarian at Carleton University and an information-management
consultant, are chilling. Isla Jordan and Ulla de Stricker describe a country “without access to
 large parts of its institutional memory, and leaders without access to the information needed for
strategic decision-making.” Toni Samek, a professor at the school of library and information
studies at the University of Alberta, puts it more succinctly. Canada is facing a “national amnesia,”
 she says, a condition that will block its ability to keep government accountable, remember its
past and plan its future.


Canada’s closed-data stance is taking root at the very moment “open data” and “knowledge
economy” are global mantras. The OECD and World Bank have led the charge for open-platform
 disclosures. In 2013, the U.S. Food and Drug Administration launched openFDA to provide easy
 public access. Last year, the U.S. Federal Reserve posted full and revealing transcripts of
meetings held by then chairman Ben Bernanke in the weeks and months leading to the 2008
 recession—there for anyone with an Internet connection to read. U.S. President Barack Obama’s
2016 budget calls for an emphasis on making data “legally and practically” more accessible.


Canada certainly talks the talk. Last year, Tony Clement, Conservative MP and Treasury Board
president, announced the Action Plan on Open Government 2014-16 to “foster greater openness
 and accountability . . . and, at the same time, create a more cost-effective, efficient and
 responsive government.” Rona Ambrose, the minister of health, announced a Transparency
and Openness Framework
that included a commitment to begin “transparently publishing drug
 safety reviews.”

Those dependent on these data balk at such claims. “Health Canada has improved its
transparency in a few small areas, but overall, does an abysmal job,” says Joel Lexchin, a
drug-industry watchdog and professor in the school of health policy and management at
Toronto’s York University. “[It] doesn’t even make public a list of drugs withdrawn for safety
reasons,” he says.

Access to federal scientific data is equally dire. Canada is in the Dark Ages, compared with the
U.S., says biologist Jeremy Kerr, a professor at the University of Ottawa. Kerr works in climate
change, ecology and conservation—“data-hungry fields,” he says. “I do big-picture ecology,
where we think across countries and continents. But rapid scientific progress stops at the
Canadian border.” Detailed information about Canada isn’t available, says Kerr, noting that his
American colleagues “make this data freely available.” Part of the problem is long-standing, he
says, arising from “the difficult nature of federal-provincial relationships.” But “accessibility of
 data in Canada is becoming less, not more,” he says. “We are not collecting a lot of data that
 used to be routine.”

Canada not only lags other governments, but also international business, says Jan Kestle,
founder and president of Toronto-based Environics Analytics: “Everyone is moving toward
setting up data-governance processes inside companies to collect information and safeguard
 it,” she says.

Related reading: A rising tide of anti-intellectual thinking

But where digitization has helped other governments and companies make more information
available, it is having the opposite effect here. The edict to eliminate information deemed
 “redundant, outdated and trivial” (known as “ROT”) gives federal managers licence to decide
 what data should be cut and what kept, says Li, the U of T librarian. “There is no transparency,
oversight, or published criteria for the decision-making process,” she says. While the U.S.
 Federal Depository Library Program tracks U.S. government-publication digitization efforts,
Canada has no such mechanism. It’s LAC’s mandate to preserve federal government
information, but there has not been a comprehensive web crawl since November 2008.

LAC is updating its “technical infrastructure,” a spokesperson told Maclean’s, and should have
missing web archive content online by early 2016. In an interview with Maclean’s, LAC head Guy
Berthiaume spoke of making LAC a “client-driven organization,” developing a three-year plan and
 digitizing a quarter of its archives. But the organization has suffered a 50 per cent cut in its
digital staff, and received no additional funding in the 2015 budget.

Yet elsewhere in government, claims of “digitization” can be a precursor to brick-and-mortar
closures. Last month, the Professional Institute of the Public Service of Canada (PIPSC), the
union representing some 15,000 federal scientists, claimed that Agriculture and Agri-Food
 Canada’s Lethbridge Research Centre, created in 1906, had been closed quietly. Agriculture
and Agri-Food Canada spokesman Patrick Girard said it isn’t a shutdown, explaining that the
government was simply “moving toward a digital-service delivery model, while keeping all
materials of business value.” But according to PIPSC, its members are losing vital data. “They
will have access to some information but in no way will they have full access; that’s not how
digitizing works,” says Peter Bleyer, special advisor at the union. Government reports note that
scientists were consulted in the process; Bleyer says they weren’t. Parsing the truth has become
 a national parlour game.

Economic considerations are cited routinely to justify cutbacks in collecting, analyzing and
digitizing information. A closer look at recent data erasure, however, suggests it runs counter to
sound economic strategy. The glaring example is the elimination of the mandatory long-form
census, a detailed survey of Canadians taken every five years. Its replacement, the voluntary
National Household Survey, added $22 million to the cost of the 2011 census; the response rate
dropped from 94 per cent in 2006 to 69 per cent, undermining the data’s reliability. “A
response rate of 75 per cent is the minimum required for sample accuracy,” says StatsCan’s
former chief statistician, Munir Sheikh, who famously resigned in 2010 after Tony Clement, then
 industry minister, stated publicly that the decision to cut the long-form census came from within
StatsCan. “The federal government misrepresented my advice,” Sheikh told Maclean’s, adding
 that ongoing cuts to the agency have undermined its credibility. StatsCan stands by the data:
“The results for the 2011 census are of very high quality, as in previous censuses,” says Peter
 Frayne of StatsCan.

Five years later, we are seeing the effects. Without the baseline provided by the long-form
census, says statistician Doug Elliott, who runs Regina consultancy QED Information Systems,
“when an employment rate or CPI [consumer price index] doesn’t make any sense, your
immediate suspicion now is that the number is wrong, rather than trying to figure out why.”
Voluntary surveys also create biased data, says Sheikh: Response rates from the very rich,
the very poor, rural areas, immigrants and Aboriginal communities tend to be far lower—so
these groups are not well-represented. “People who do not respond well to a voluntary survey
are the very people social policy tries to help,” he says. “So if you were to base policy on data
received, you’d say, ‘Gee, we don’t have a poverty problem in this country.’ ”

“You see this continual silencing of people who are not ‘winners,’ for lack of a better word,” says
Armine Yalnizyan, a senior economist with the Ottawa-based Canadian Centre for Policy
 Alternatives. “The result is that the government can pretend they’re not there. It’s like, ‘If you’ve
got any needs, we can’t hear you, we can’t see you, la la la.’”

Sheikh believes we’d be better off with no census data than what we have now: “Then you do
 the best you can using reasoning, logic, and whatever other stats may be available [on which]
to base your decision. But when you have wrong information, chances are, it will put you on the
wrong path to policy development.”

Environics Analytics’ Kestle agrees, noting that the neighbourhood-level data provided by the
census was vital for businesses looking to understand local markets, to decide where to locate
or how to direct marketing. Businesses, including his own, have been hard-hit. “We’re
in a labour-market crunch in Saskatchewan,” says Elliott. “Businesses come to me and ask where
they’re going to get employees; we’re stuck with poor-quality data to help them.”

Government, too, is operating in the dark, as evidenced last year when StatsCan was unable to
provide auditor general Michael Ferguson with job data during the contentious debate over
 proposed reforms of the Temporary Foreign Worker Program. The Department of Finance was
relying on data from the online classified service Kijiji to back its position.

As a result, economic decision-making is compromised, as a July Scotiabank report points out. It
 claimed it would be “ill-advised” for the Bank of Canada to make rate-cut decisions based on
StatsCan data, because it’s “stale”: “Canadian data might be following a similar trajectory to that
which we have observed in the U.S., but, unfortunately, the problem here is that the Canadian
data notoriously lags the U.S.,” it said. The report also noted that much of Canada’s trade data
on resources, especially energy, “is inferred, because it is not available on a timely basis.”

Lack of tracking capacity also blinds us to whether there have been improvements in the
inadequate housing and overcrowded conditions in the North, exposed in the 2006 census, 
says Yalnizyan: “We’ll never know whether that is improved or not.”


When it comes to data, Canada has become “a cautionary tale,” says Phil Sparks, a former
associate director of the U.S. Census Bureau who is co-director of the Washington-based Census
Project, a group fighting to maintain the U.S. equivalent of a mandatory long-form census in the
face of Republican calls for its elimination. He calls the Canadian experience “an unmitigated
disaster.” The census’s elimination damaged Canada’s international reputation, says John
Henstridge, president of the Statistical Society of Australia. “Prior to that, Statistics Canada was
regarded as possibly the best government statistical body in the world.”

But the census is far from the only issue; less discussed is the 2012 elimination of four key
longitudinal studies, some dating to the 1970s, which tracked health, youth, income and
employment. Economist Miles Corak, a professor at the University of Ottawa who studies income
inequality and poverty, calls this a major informational loss, as well as “money down the drain.”
 “Longitudinal studies are very expensive,” Corak says, “but their value increases exponentially
with time.” He compares the loss to stopping a movie halfway through: “Only after you follow a
 group of children for 12 or 15 years, and they’re on the cusp of entering the labour market, do
you have the capacity to see how adult success is foreshadowed by their family origins.”
Statistics tell a human story, Corak says: “We think of statistics as cold, but they are the real lives
 of people embedded in bits and bytes. They live and breathe.”

Related reading by John Geddes: Why Stephen Harper thinks he’s smarter than the experts

Cutting the Survey of Labour and Income Dynamics, a longitudinal study tracking economic
well-being since the mid-’90s, left the country unable to measure changes in income over the
 longer term, says economist Stephen Gordon, a professor at Université Laval. It was replaced
 by the Canadian Income Survey, which uses a different methodology; now, old income data can’t
 be connected with new income data, he says. The upshot? Comprehensive Canadian
 income-data history currently begins in 2012.

Gordon expresses alarm that 20 years of data history between 1960 and 1980 vanished in 2012
due to changes in the way national accounts, GDP and other data were compiled: “It’s now
 impossible to have a clear picture of the Canadian economy since the Second World War,” he
says. And that’s a huge problem for analysts who need to look at pressing concerns, such as the
 current oil price crash in context. “You want to look at data about oil prices rising in the ’70s, but
you can’t.”

Lost StatsCan studies have been replaced by new studies—but what the new data track can be
 telling. The Households and the Environment Survey, begun in 2013, for example, tracks
Canadians’ involvement with the environment—using measures such as birdwatching and
volunteerism. The latest data reveal that 25 per cent of households have bird houses or feeders,
and 18 per cent engaged in unpaid activities aimed at “conservation or protection of the
environment or wildlife.”

Yet tax-funded environmental monitoring, conservation and protection have been debilitated by
the closure of 200 scientific research institutions, many of which monitored food safety and
environmental contaminants. Some were internationally famous. The Polar Environment
Atmospheric Research Laboratory
in Nunavut, which played a key role in discovering a huge
hole in the ozone layer over the Arctic, closed in 2012. Also shuttered was a brand-new
climate-controlled facility at the St. Andrews Biological Station in New Brunswick. The original
station provided writer Rachel Carson with documentation of DDT killing salmon in local rivers,
reported in her 1962 book Silent Spring, credited with giving rise to environmentalism.

The data gap naturally affects policy. “How can Environment Canada know how pollution from
the oil sands has changed over the last 30 years, if they don’t have access to baseline reports?”
 asks Hoff, who reports that former Environment Canada colleagues call him for reports they can
no longer access internally. Fisheries scientist Jeffrey Hutchings, a professor at Dalhousie
University, says he can’t find studies on cod stocks dating to the 19th century that he referenced
two decades ago at the now-closed St. John’s library, studies with profound implications for cod
management. “The work I was able to do then couldn’t be done now.”

The effects of penny-wise, short-term thinking are being felt in some of our most important
 research organizations. Consider the changes at Canadian Institutes of Health Research (CIHR),
 the country’s primary health research funding agency, says Yalnizyan. Earlier this year, the CIHR
 outraged the scientific community when it announced that, as of September, it would no longer
fund Cochrane Canada, part of a respected global collective known for “evidence-based,” systematic
reviews free from commercial sponsorship and conflict of interest. But outlays for Cochrane—less than $2 million annually—were cost-efficient: For the $100,000 CIHR pays for a single “knowledge
 synthesis,” Cochrane can produce five.

The National Research Council (NRC), the country’s pre-eminent scientific institution, has seen
 a similar erosion following its shift in focus from pure science to applied science. “The NRC was
 the Rolls-Royce of federal science,” says Kerr, the University of Ottawa biologist. It now defines
 itself on its website as a “concierge service”—“a single access point where small and
medium-sized enterprises can find high-quality, timely advice to help them innovate and
accelerate their growth.” Output has plummeted: Published research, in areas ranging from
medical technologies to astrophysics, declined from 1,425 reports in 2010 to 436 in 2012.
 Innovation, measured in patents filed, also declined, from three in 2010 to zero in 2012. “Their
research mission has been destroyed.”

Failing to invest in pure science is ultimately bad for business, says Katie Gibbs, executive
director of the advocacy group Evidence for Democracy. “It may not pay off in the short term, but
 it’s necessary to feed applied science,” she says, pointing out that many technological advances
 that drive our economy and quality of life—cellphones, satellites, GPS, MRIs, even Velcro—had
their starts in government-funded basic research.

Nowhere is the information deficit more acute than in Canadians’ ability to assess government’s
 own functioning. Assessing the performance of a government that turns—and campaigns—on
its economic record has been compromised as a result, a serious problem during a federal
 election, says Yalnizyan: “We have no income data post-2011 on a historically comparable basis,
 other than from tax records, which don’t give us information about families or poverty or inequality. I don’t think that is by accident.”

Corak, too, has concerns: “Income levels are something Canadians should be as aware of as
much as the inflation rate and unemployment rate: ‘How much money do they make on average?
 How is that distributed?’ ” Data erasure, unsurprisingly, is an election issue itself, with the NDP,
 Liberals and Greens all vowing to restore the long-form census.

In the absence of readily available information, a record number of individual Canadians are
turning to the mechanism to access internal government records and information, the Access to
Information and Privacy system
(ATIP). But that system is also a shambles. Fewer requests are
being processed, at a more glacial pace and with more redactions. Information commissioner
Suzanne Legault found in a 2015 report that only 21 per cent of access requests in the 2013-14
fiscal year resulted in information released, compared to 40 per cent in 1999-2000. Legault made
85 recommendations for reform, including extending coverage to the Prime Minister, ministers
and parliamentary secretaries. A recent change allowing government bureaucrats to determine
whether material is classifiable as “cabinet documents” that are exempt from ATIP concerns
Vincent Gogolek, executive director of the B.C. Freedom of Information and Privacy Association:
 “The process is opaque,” he says. “Even the information commissioner is not allowed to look at
it. So, no one can say, ‘Yes, they are applying this properly.’ ”

Also disturbing is Bill C-59—this year’s budget bill—which retroactively revised the ATIP law in
an effort to exempt all records related to the defunct long-gun registry from any form of request,
including complaint, investigation, judicial review or appeal. The change was made as Legault
was poised to recommend possible criminal charges against the RCMP for withholding—and
later destroying—gun-registry documents. By backdating the ATIP law’s revision to October 2011,
 the change effectively rewrote history, a “perilous precedent,” as Legault put it, that could be
 used by governments to retroactively rewrite laws.

Treasury Board President Tony Clement did not respond to Maclean’s interview requests, but he
 has rejected criticism directed at ATIP: “We are the most open and transparent government in
 the history of this country, and we are darn proud of it,” he told the House of Commons earlier
this year, noting that the current government has processed the most ATIP requests. “That’s
because they’ve received the most ATIP requests,” says Gogolek, “which is something they also
 use to blame the delays on.”

ATIP cases now clog the judiciary: The Federal Court of Appeal intervened earlier this year,
when a citizen seeking information on the sale of military assets was told it would take 1,100
days. In 2013, an Ontario court had to order the federal government to release thousands of
pages of documents detailing government involvement in residential schools withheld from the
Truth and Reconciliation Commission. “It really got in the way of the truth-telling,” says Cindy
Blackstock, executive director of the First Nations Child and Family Caring Society. ATIP delays
 affected a complaint her organization filed with the Canadian Human Rights Tribunal claiming
the federal government discriminated against First Nations children, she says. An ATIP request she filed
in November 2012 was not answered until April 2013, after hearings had started. The hearings were
 postponed when it was discovered the government had withheld 90,000 pages of documents.

A pattern of disappearing information has raised questions about political interference, notably,
after the Canada Revenue Agency ordered employees to destroy all text-message records. The
concern is that the agency was covering up evidence of a crackdown on charities that opposed
government policy.

The census scandal drew international criticism. Government interference in statistical gathering is
both unusual and unacceptable, says Denise Lievesley, a professor of social statistics and dean of
faculty at King’s College London and former director of statistics at UNESCO: “We were quite
 shocked when that decision was a political decision.”
(Kayla Chobotiuk)

As the government becomes increasingly opaque, citizens’ lives have become more transparent
 than ever before, says Brian Campbell, former head librarian at the Vancouver Central Library.
 “The decline in gathering social data about Canadians is occurring just as the government’s
ability to gather information about and monitor Canadians is unprecedented,” he says. Yalnizyan
 agrees. “Government is more intrusive and more coercive of information than we’ve ever seen,
with the passage of Bill C-51, and Bill C-377, which requires unions to publish information about
their leaders—not only how much money they make, but also what they do on the job, as well as
 their time off the job.” The Voices-Voix study documented more than 100 cases of the
government monitoring groups and individuals, among them civil servants, women’s groups,
human rights organizations and Indigenous organizations. Blackstock’s professional and
personal life was monitored; in her case, the privacy commissioner determined the conduct
violated the Privacy Act. “They were trying to discredit me, rather than
arguing the [child welfare] case on the evidence,” Blackstock says, noting that 189 officials
followed her movements: “Beyond being shocked and horrified, as a taxpayer, I thought, ‘What
a huge waste of money.’ ”

Tax money was also deployed to try to dissuade the L.A.-based Society of Brain Mapping and
Therapeutics from giving Liberal MP Kirsty Duncan, a former scientist who sat on the
Intergovernmental Panel on Climate Change, an award in 2012. Babak Kateb, a neuroscientist
at the Cedars-Sinai Medical Center in L.A., who is the chairman and CEO of the society, says
 he received a call from the Canadian consulate in L.A. discouraging him from proceeding with
recognizing Duncan. “I was shocked by the meddling,” Kateb told Maclean’s, adding that the
 government later participated in the society’s 2015 conference. “Your current government
 doesn’t know how to deal with science,” he says. “On one hand, they object to an award being
given to a champion of science, but then, years later, support the organization she is on the
board of. They don’t have a compass.”

Related reading from Paul Wells: Stephen Harper, friend of science, kind of

The vanishing of Canada has created a counterinsurgency—scientists, researchers, economists,
 civil rights groups, librarians and artists marshalling resources and their own time to monitor,
expose, protest and create a new literature of knowledge loss. Li, for one, has taken preservation
of national records into private hands by spearheading an effort with universities across the
 country dubbed LOCKSS—“Lots of copies, keep stuff safe”—to archive federal websites, an
exercise not unlike trapping fireflies in a jar: “Without that or a print record, there’s no way of
tracking change.” After the government changed its Crown copyright policy (the guidelines for legally
reproducing government documents) in 2013, Li went online for the old copy. It had vanished. In July,
Evidence for Democracy launched True North Smart and Free, an interactive website
documenting seven years of changes to how science is collected and used in federal policy
decision-making.

Meanwhile, as actual information vanishes, it’s being replaced by mythologizing historical
 narratives. As stations monitoring climate change close in the Arctic, historic missions in the
North, notably, the Franklin expedition, are celebrated; at a time when veterans’-services offices
 have been closed and StatsCan no longer tracks military personnel, or wages and salaries of
veterans, soldiers who fought historic wars are memorialized, with $28 million spent on the
anniversary of the War of 1812, to cite one example.

Archival history is a casualty when a country is in a severe economic, military and political crisis,
 says Robin Vose, president of the Canadian Association of University Teachers. “Why do we
 need to let it fall victim in peacetime, when we’re an affluent society?”

Yet LAC did recently add to the national archives, making its first purchase this April in almost a
 year: a parcel of 19th-century paintings, illustrations and journal materials from the Peter
Winkworth collection of Canadiana. The single most expensive purchase, at $46,750, was an
1883 oil painting of a fish hatchery in Newcastle, N.B. (part of Miramichi) by Edward Scope
Shrapnel. The irony is acute. We’ve lost data, trashed records and stopped monitoring vital
aspects of fisheries and aquatic life. But we’ve gained an idyllic rendering of an ecologically
untroubled time that now serves as yet another indictment of what has been lost.

– With Zoe McKnight and Monika Warzecha

The story has been edited to clarify the fact that one third of residents in Melville, Sask., got the
NHS in 2011.