Part 4 of 4 – Conclusion: SLD Blocking Is Too Risky without TLD Rollback

ICANN’s second-level domain (SLD) blocking proposal includes a provision that a party may demonstrate that an SLD not in the initial sample set could cause “severe harm,” in which case that SLD can potentially be blocked for a certain period of time. The extent to which that provision would need to be exercised remains to be determined. However, given the concerns outlined in Part 2 and Part 3 of this series, it seems likely that there could be many additions to (and deletions from!) the blocked list, given the lack of correlation between the DITL data and actual at-risk queries.

If the accumulated risk from non-blocked SLDs were to become too large, it could become necessary for ICANN to withdraw the entire gTLD from the global DNS root. Changes to the DNS root, once properly approved and authorized, can be implemented rapidly by updating the root zone file and notifying root server operators that a new zone file is available. This part of the process is as straightforward for deletions as for additions. The approval and authorization process, however, would need to be much faster for a deletion than it currently is for an addition, because of the urgency of making the change, or “rollback,” once a determination is reached that a gTLD’s delegation must be revoked. The importance of rapid un-delegation is affirmed in Recommendation 3 of SAC062: Advisory Concerning the Mitigation of Name Collision Risk, published Nov. 7 by ICANN’s Security and Stability Advisory Committee (SSAC):

Recommendation 3: ICANN should explicitly consider under what circumstances un-delegation of a TLD is the appropriate mitigation for a security or stability issue. In the case where a TLD has an established namespace, ICANN should clearly identify why the risk and harm of the TLD remaining in the root zone is greater than the risk and harm of removing a viable and in-use namespace from the DNS. Finally, ICANN should work in consultation with the community, in particular the root zone management partners, to create additional processes or update existing processes to accommodate the potential need for rapid reversal of the delegation of a TLD.

For similar reasons, the DNS resource record TTLs for a new gTLD need to be managed carefully to minimize the residual effects that may occur should a problematic TLD delegation be removed.
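To see why TTLs matter for rollback, consider a minimal back-of-the-envelope sketch, assuming the standard DNS caching model: a resolver may cache the TLD’s delegation NS RRset just before the rollback, and may fetch a record through that cached delegation just before the NS entry expires; that record then lives in the cache for its own TTL. The function name and the specific TTL values below are illustrative assumptions, not figures from the proposal.

```python
# Sketch: upper bound on how long a removed gTLD's data can linger in
# resolver caches after its delegation is pulled from the root zone.
# Worst case: the delegation NS RRset is cached at the moment of removal,
# and a record is fetched through it just before that cache entry expires.

def worst_case_residual_seconds(delegation_ns_ttl: int, max_record_ttl: int) -> int:
    """Upper bound (seconds) on residual cache lifetime after un-delegation."""
    return delegation_ns_ttl + max_record_ttl

# Root-zone delegation NS records commonly carry a 2-day TTL (172800 s).
# With a conservative 5-minute TTL on the gTLD's own records:
print(worst_case_residual_seconds(172800, 300))    # just over 2 days
# With a 1-day record TTL, stale answers could persist up to 3 days:
print(worst_case_residual_seconds(172800, 86400))
```

The sketch illustrates the trade-off: short TTLs on a newly delegated gTLD’s records shrink the window of residual effects after a rollback, at the cost of higher query load while the delegation is live.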
ICANN’s proposal to accelerate the delegation of new gTLDs that implement SLD blocking misses the point of risk mitigation for name collisions. First, it relies on DITL data sets that are not statistically valid for determining whether an SLD is at risk. Second, it overlooks the qualitative analysis that is necessary for determining whether an installed system is at risk. And third, it lacks the failsafe of a rollback capability that is a necessary precaution in case the aggregate risk reaches a point that is otherwise unmanageable.

These factors argue for a more focused evaluation of SLD blocking before it is widely adopted. They also point to the importance of understanding how installed systems are likely to be affected by changes in the global DNS, and the effectiveness of different risk mitigation techniques. The complexity of this task should not be underestimated, as the recently published SAC062 makes clear in its analysis of the benefits and risks of “trial delegation,” which has been proposed as a way to understand the impact a full delegation might have.

To accelerate the applied research that will add to this understanding in these and many other areas, Verisign Labs is organizing a new Workshop and Prize on Root Causes and Mitigation of Name Collisions (WPNC) in early 2014. The call for qualitative analysis remains vital, and even more so if the changes to the global DNS are accelerated without fully understanding their impact.

Additional posts in this series:

  1. Part 1 of 4 – Introduction: ICANN’s Alternative Path to Delegation
  2. Part 2 of 4 – DITL Data Isn’t Statistically Valid for This Purpose 
  3. Part 3 of 4 – Name Collision Mitigation Requires Qualitative Analysis 

Burt Kaliski

Dr. Burt Kaliski Jr., Senior Vice President and Chief Technology Officer, leads Verisign’s long-term research program. Through the program’s innovation initiatives, the CTO organization, in collaboration with business and technology leaders across the company, explores emerging technologies, assesses their impact on the company’s business, prototypes and evaluates new concepts, and recommends new strategies and solutions.