Case study: Review of Environmental Assessment Processes
Overview
The scale of the EA review engagement was unprecedented for the Canadian Environmental Assessment Agency. The Agency used a combination of methods and tools to seek out a broad range of views, including Eventbrite, Choicebook, facilitated workshops, a dedicated online portal, social media info bites and targeted outreach. The Panel's mandate was clear and articulated the scope of the engagement to the public and participants. The review of the EA process was made as transparent and inclusive as possible; every comment received was published online, with links to where it could be found in the final report and recommendations. Success came down to having staff with the right skill sets and experience, and to executives setting a clear example by speaking with people to demonstrate leadership and governance.
History
Introduction
A commitment to review the current act was included in the mandate letter to the Minister of Environment. Public sentiment was that a new assessment process was needed to replace the Canadian Environmental Assessment Act, 2012, one that was grounded in evidence and established through consultation with Indigenous peoples and Canadians. Following initial consultations to develop recommendations on how the Environmental Assessment review would be conducted, it was decided that an independent review panel ("the Panel") would be established. The subsequent consultations and reports produced by the Panel are the focus of this case study.
Why Engage?
The consultation was initiated with the goal of restoring trust in responsible federal oversight of Environmental Assessment processes. To accomplish this, an arm's-length, neutral, independent review panel led consultations that informed the development of recommendations for changes to the Environmental Assessment Act.
The consultation occurred between September 1st, 2016 and March 31st, 2017.
People and Context
Who was included
The Panelists, with support from the Canadian Environmental Assessment Agency, sought input from: academics; business and industry associations; the mining, quarrying, and oil and gas extraction industries; the general public; Indigenous peoples; National Indigenous Organizations (NIOs); non-profit organizations; experts in the natural resources industry; and provinces and territories.
Funding
Federal funding of $2.5 million was allocated to the project through the Budget.
Goals and Objectives (Policy)
The objective was to overhaul the existing federal environmental impact assessment process by designing a transparent consultation process that brought stakeholders together to provide their input and expertise, and established ongoing partnerships with influencers and affected groups.
Methods and Tools
Methods and tools were chosen with the intent of ensuring transparency throughout the entire consultation process.
Methods:
* Workshops. Workshops that focused on understanding stakeholder values. The Panel also attended these engagement sessions.
* Panel presentations, also known as Town Halls. The Panel, accompanied by members of the secretariat, travelled to 21 cities to hear Canadians' views and better understand the issues around environmental assessment. The public was invited to present their views to the Panel.
* Indigenous Dialogue Sessions. These sessions were opportunities to hear views from Indigenous peoples and to capture their unique challenges related to environmental impact assessment. The Panel spent two days in each location; one session was allocated for public input, and two were dedicated to Indigenous peoples.
* Request for comment. Views could be submitted through a dedicated email address, as well as mailed to CEAA.

Tools:
* Online portal. Used to display submissions and collect input. It was important to the team that all information received be open and accessible, so there were no login requirements for the site.
* Choicebook. This online questionnaire was used to collect input.
* Eventbrite. Used for invitations to regional events; the team found this to be a very effective tool as a centralized system for managing and recording participation.
The team used online advertising to raise awareness of the consultation and encourage attendance at regional in-person events. Prior to travelling, the Expert Panel Secretariat asked local, regional and national organizations to post the details of engagement sessions on their respective websites and to promote participation via social media. To help further amplify the message, a media toolkit was created with a standard template and info bites for news bulletins. The Secretariat considered this process to be a key factor in achieving such a high turnout.

While events were set up prior to travel, organizers tried to remain flexible. During city visits, if an unanticipated audience reached out with the desire to participate, the Secretariat organized ad hoc meetings with them during the trip.
Eventbrite proved to be an effective tool for managing invitations to the in-person sessions and recording participation, as well as for flagging cities where additional time for presentations might be necessary. Due to the overwhelming response from Canadians wanting to attend the in-person sessions, two kinds of tickets were created on Eventbrite: one for presenters and one for observers. Distinguishing these two groups facilitated advance planning of events and ensured the appropriate accommodation measures were in place.
An email address offering service in both official languages, as well as a 1-800 phone number, was set up to respond to questions and concerns and to provide another channel for people to share their views. A dedicated team responded to inquiries, with a service standard of replying within 48 hours.
Engagement
The scope of the engagement was national; regional engagement sessions were held in 21 major cities across Canada and were led by a four-member panel of environmental experts. The Panel heard 377 in-person presentations, including 128 Indigenous presentations. Workshops were also used to advance learning, and they provided a forum to help people engage with complex subject matter. The workshop format was tested and iterated upon as the sessions progressed, which allowed staff to learn new techniques and to give participants new opportunities to share perspectives as they emerged.
The in-person events enabled the Expert Panel to provide Canadians with engagement opportunities more closely tailored to different levels of awareness of, and sophistication with, the subject matter. Oral presentations were more appropriate for those with existing knowledge of Environmental Assessment. Evening workshops provided an informational component and focused on learning what Canadians' values were, to determine what was important for the future of Environmental Assessment.
The entire process was open to the public. All submissions were put on a website where they were accessible to both participants and the larger community, at the same time as the engagement sessions were occurring. Once the final report was compiled, the website was adapted to provide references to where each participant’s input could be found in the final report. The full report is available online. These practices reflect the overarching objective to be transparent and ensure that public input is reflected in the Panel’s recommendations.
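The case study does not describe how this submission-to-report cross-referencing was implemented. Purely as an illustration of the idea, the sketch below shows one way submissions could be linked to the report sections that cite them; the Submission structure, the build_compendium helper, and the example identifiers and URLs are all hypothetical and are not drawn from the Agency's actual system.

```python
# Minimal sketch (hypothetical, not the Agency's implementation) of linking
# published submissions to the sections of a final report that draw on them.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class Submission:
    """One public submission posted to the consultation website."""
    submission_id: str                     # identifier assigned when the comment was received
    author: str                            # participant or organization name
    url: str                               # public URL where the full submission is posted
    report_sections: List[str] = field(default_factory=list)  # report sections that cite it


def build_compendium(submissions: List[Submission]) -> Dict[str, List[Submission]]:
    """Group submissions by report section so readers can trace which input
    informed each part of the final report."""
    compendium: Dict[str, List[Submission]] = {}
    for sub in submissions:
        for section in sub.report_sections:
            compendium.setdefault(section, []).append(sub)
    return compendium


if __name__ == "__main__":
    # Two entirely fictional submissions, for illustration only.
    subs = [
        Submission("S-0001", "Example NGO", "https://example.org/submissions/S-0001",
                   ["3.2 Indigenous participation"]),
        Submission("S-0002", "Example resident", "https://example.org/submissions/S-0002",
                   ["2.1 Public engagement", "3.2 Indigenous participation"]),
    ]
    for section, entries in sorted(build_compendium(subs).items()):
        print(section, "->", [s.submission_id for s in entries])
```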
Throughout the consultation, the team encountered some challenges, including comprehension difficulties due to the breadth of issues covered and the complexity of the subject matter. The workshops focused on two-way information sharing to build participants' technical understanding of environmental assessment processes, knowledge they could then use to formulate their submissions to the Panel. It was discovered early on that the 15-minute limit on presentations was too short, especially for Indigenous groups, so the Expert Panel and engagement facilitators quickly re-thought the approach and decided to eliminate the time limit. Funding was made available to support travel by Indigenous peoples and members of the public to the in-person sessions. The team attributed the consultation's success to the Panel's dedication to attending every session in person and engaging with participants.
Analysis
It took six weeks to review all the submissions and data received. The dynamic web function of the site and the final report was seen as a success. All submissions were made available online, with links so that participants could refer back to what they said and see where their input appeared in the final report. CEAA staff continually monitored submissions as they arrived on the website.
Communicating back
The team committed to responding to all submissions received, whether by phone call, letter or email. This assured participants that their feedback had been received and their voices heard. The final report was published online in full HTML format, with a downloadable PDF option. All materials received, including emails and written submissions, were shared online. Transcripts from formal presentations and materials submitted during the in-person sessions were also made available online, along with summaries of what was heard in each city. Comments were collected into an online compendium to show Canadians how their input and recommendations are reflected in the Panel's report. In the post-consultation evaluation, survey feedback about the process and experience was overwhelmingly positive. The CEAA team is putting in place a proposal to maintain ongoing stakeholder relationships.