Meeting Report

I. Background  |  II. Event overview  |  III. Session summaries  |  IV. Moving Forward

Session summaries

Defining quality and minimal quality standards for AVMS – group work

Participants were then split into four groups to further discuss the definition of quality and minimal quality standards. The group work sessions aimed to generate thinking on a) principles of ‘quality’, b) tools required to support the work of AVMS, c) cost-benefit/value-for-money considerations applied to AVMS, and d) operational considerations in relation to the broad area of generating data and analysis and using data and analysis for armed violence reduction (AVR) policies and programming.

Ideas emerging from the group work included:

a.      Quality principles

  • Consultation is required both within AVMS teams and with external actors. The process is vital at different stages of data collection. Some participants highlighted that when data are collected from the community, one should empower the community by providing it not only with ownership of the data but also by sharing the results with it. This strategy helps sustain coordination between the observatory and the community at the local level. Ensuring the ethical dimension of both the collection and use of data was also recommended.
  • Coordination across different institutions is a key point to ensure data clarity, consistency, and accuracy. It is necessary to harmonize the methodology, to rely on clear concepts, and to use the same references to improve data quality. Some participants also pointed out situations where the state could not provide data. In such situations, other sources of information become the main providers of data. However, these actors also need clear definitions, coordination, and a unified methodology.
  • Transparency is necessary throughout the entire process: acknowledging the limitations of what is being done, performing effective reporting, and conducting follow-up.
  • Ensuring breadth of coverage. Where possible, it is important to include a range of forms of violence such as gender violence, child violence, organized crime, and armed violence.
  • Quality vs. quantity. Quality also means the willingness to assess the standards and tools used by the AVMS. Participants also felt that it is important to prioritize what exists in terms of data and improve its quality, making best use of limited available resources. Clarifying the goals and scope of an AVMS can be an effective exercise when considering quality-standards principles.
  • Taking into account issues of privacy is crucial to avoid causing harm once the information is made public.

b.      Tools required to support the work of AVMS

  • Consultation is a tool in itself that should be maintained both within the team and with outside actors and stakeholders.
  • Manuals can be important tools for sharing knowledge, adapting data, discussing problems, and sharing solutions.
  • Training is a key tool. Important considerations include who does the training, who the training is for, and, importantly, what the follow-up will be after the training. Training can be needed at different stages (with special efforts at an early stage). Developing capacities within the police academies in South Sudan was cited as a positive example.
  • Other important tools include relevant and accurate data collection methodologies and coding techniques.

c.      Cost-benefit/value-for-money considerations

  • The better the quality of data, the more efficient institutions can be. More information about what is happening leads to more ways to solve problems.
  • Providing policy-makers with timely and relevant information, including informing them where to find data, will improve the perception of institutional quality. This way of doing business can also attract interested donors.
  • Since generating new data and information is costly, start by combining what is already available.
  • When developing AVMS, it is necessary to identify which institutions have the capacity to support the initiative, which can help reduce costs. For instance, being assisted by a university could lower some of the costs.
  • AVMS need long-term funding cycles.
  • Investing more at the beginning of the project is more likely to produce results.

d.      Operational considerations

  • It is essential to involve the right people from the beginning to best identify problems.
  • Getting people to share information is more difficult than one might think; sometimes there is no access to politically guarded databases. There may be little point in setting up an observatory if the government is collecting relevant data but refusing access to those data.
  • In terms of standardization and adaptability, it is crucial to make the information relevant to the local setting, as the location of the observatory will have implications on its fundraising and its operationalization.
  • Think of observatories as multi-actor platforms.

From the working groups the following issues emerged:

  • Partnership is a crucial element in the work of AVMS/observatories. For instance, in Jamaica the local observatory helped strengthen trust among different actors, particularly between government and communities. The observatory there organizes monthly meetings with different stakeholders and actors, including those facing violence. Such meetings have been successful in solving issues. There is a need to think carefully about the inclusion of possible new stakeholders, such as businesses and private sector companies, but also to remain open to actors coming from outside the violence and crime context.
  • New technology can help in producing actionable information. Whether observatories are able to use all of the information generated by evolving technologies was raised as a question. The topic stimulated a lively discussion among participants: some noted its important role and potential, while others considered it a tool that cannot provide solutions by itself. For some, technology—including GIS—has been successful in providing actionable information; for example, information about hospital access and different services helped solve many access problems. Information technology may help in understanding the “micro-climate” among different variables in different places. The use of ‘new technologies’ in armed violence observatories will need to be addressed in more detail in the future.
  • Funding. It was noted that the first five years of the project life-cycle are crucial to the sustainability of a project. There is a need to know not only who external donors are at the beginning of a project but also where they might be after five years. For institutions, it is important to understand the logic of the donors. Often, donors may provide funding only one year at a time, and some would limit their support to a maximum of three years. From a business perspective, the possibility for the observatory to charge fees for services was also raised. Some observatories have considered selling data; however, many issues were raised on this matter, including credibility, control, and dependency.
  • Terminology. The definition of ‘observatory’ remains problematic. Various actors and organizations use the term observatory, but they mean different things. Most participants agreed that a detailed discussion of this issue was necessary, especially because different definitions result in different policies.
  • Indicators were identified as critical to the success of observatories: Relevance, useful, partnership, sustainable, good leadership, reduction, systematic, money, impact, credibility, innovation, ownership, integrity, capacity, disappear, prevention, resilience, cohesion, honesty, advocacy, political will, neutral, accountability, independence, security, and privacy were enumerated as key words. It was noted that safety was absent from the list, but that security was perhaps an acceptable proxy for safety.