
The Art of Archiving Complex Modern Websites

You’ve probably heard of web crawlers (sometimes also called spiders or spiderbots). Much like real spiders scurrying around, these bots systematically scour the internet for website content. They discover and visit sites by following links in order to copy pages and process them.  
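To make that concrete, here is a minimal sketch of such a crawl loop in Python, using the `requests` and `BeautifulSoup` libraries. It illustrates the general technique only; production crawlers also handle robots.txt rules, rate limiting, redirects, and deduplication.

```python
# Minimal illustration of the crawl loop described above: fetch a page,
# save a copy, extract its links, and queue any links not yet visited.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url: str, max_pages: int = 50) -> dict[str, str]:
    """Breadth-first crawl within one domain; returns {url: html}."""
    domain = urlparse(start_url).netloc
    queue, seen, pages = deque([start_url]), {start_url}, {}

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip unreachable pages in this sketch
        pages[url] = html  # the "copy the page" step

        # Discover new pages by following links, as described above.
        for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return pages
```

Calling `crawl("https://example.com")` (a hypothetical starting URL) would return a snapshot of the pages reachable from that homepage, which is the raw material an archive is built from.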

The most common use case for this is internet search. Search engines like Google use bots to index sites for their results pages. But there are many other reasons to crawl pages as well. At PageFreezer, crawling technology is used to archive customers’ websites. We regularly crawl sites to ensure that any change to any page is recorded and saved. Customers can then review these chronological versions of their websites through our dashboard, which allows them to replay any version as if it were live and instantly identify changes to a page.
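As a rough illustration of the change-identification idea (not Pagefreezer's actual implementation), two crawl snapshots can be compared by hashing each page's content and flagging URLs whose fingerprints differ:

```python
# Illustrative sketch: spot changes between two crawl snapshots by
# hashing each page's content and comparing the fingerprints.
import hashlib

def fingerprint(html: str) -> str:
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

def diff_snapshots(old: dict[str, str], new: dict[str, str]) -> dict[str, list[str]]:
    """Compare two {url: html} snapshots and classify each URL."""
    return {
        "added":   [u for u in new if u not in old],
        "removed": [u for u in old if u not in new],
        "changed": [u for u in new
                    if u in old and fingerprint(new[u]) != fingerprint(old[u])],
    }
```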


Why Organizations Archive Their Websites   

Why is all this necessary? For many of our enterprise customers, it comes down to regulatory compliance. With industry regulators demanding the collection and preservation of all online communications, having an archiving solution in place is an absolute requirement. Failure to preserve records in appropriate, replayable, and verified formats can result in significant penalties.      

Want to learn more about the most advanced Website Archiving Solution available? Visit our product page, or click the button below to schedule a demo.

Schedule a Demo

 

Lawsuits, false claims, and eDiscovery responses are also common reasons. Should someone claim that a statement appeared on a company website when it never did, the organization has to be ready to disprove the claim. Doing so requires evidentiary-quality records of website content. PageFreezer provides evidentiary-quality records by delivering archive data that is timestamped and digitally signed, so that its authenticity can be verified.
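The general idea behind such a record is sketched below: hash the captured content, attach a UTC timestamp, and sign the result so that any later change is detectable. This is purely illustrative; it uses an Ed25519 key from the `cryptography` package and is not PageFreezer's actual record format or signature scheme.

```python
# Sketch of a timestamped, signed capture record (illustrative only).
import hashlib
import json
from datetime import datetime, timezone

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def make_record(url: str, html: str, key: Ed25519PrivateKey) -> dict:
    """Hash the captured page, timestamp it, and sign the result."""
    record = {
        "url": url,
        "sha256": hashlib.sha256(html.encode("utf-8")).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["signature"] = key.sign(payload).hex()
    return record

def verify_record(record: dict, public_key) -> bool:
    """Return True only if the signature still matches the record's contents."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    payload = json.dumps(unsigned, sort_keys=True).encode("utf-8")
    try:
        public_key.verify(bytes.fromhex(record["signature"]), payload)
        return True
    except InvalidSignature:
        return False
```

A record produced this way can be handed over years later, and anyone holding the matching public key can confirm that neither the content hash nor the timestamp has been altered.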

 

The Challenges of Archiving Modern Web Pages

While the idea of crawling websites is nothing new, the challenges of crawling many modern pages for archiving purposes are significant. Organizations like banks, for instance, have incredibly complex websites that consist of dynamic content, password-protected pages, personalized pages, and form flows. Capturing all of this accurately purely through crawling (without the need for complicated scripting or engineering) isn't easy.

In fact, it’s a challenge that many archiving companies struggle with. A leading global financial services firm recently signed a contract with PageFreezer after concluding that we had the only web crawling technology capable of accurately capturing its highly-personalized dynamic website for eDiscovery and regulatory compliance purposes.

PageFreezer conducted an extensive two-year R&D program that resulted in the third generation of our dynamic web capture technology. Thanks to this latest crawler, PageFreezer can capture webpages generated client-side by JavaScript/Ajax frameworks, including Ajax-loaded content. It can also collect multiple steps in web form flows and capture webpage content that is displayed only after a user event (for example, when a section of a page loads additional content via Ajax after a user clicks).
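The usual way to capture this kind of client-side content is to render the page in a real browser engine rather than simply fetching its HTML. The sketch below uses Playwright purely as an illustration (it is not PageFreezer's capture stack), and the `#load-more` selector stands in for any hypothetical user event that reveals additional Ajax-loaded content.

```python
# Illustrative sketch: render a page in a headless browser so that
# JavaScript/Ajax content is present in the captured DOM.
from playwright.sync_api import sync_playwright

def capture_rendered_page(url: str) -> str:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        page.goto(url, wait_until="networkidle")  # let initial Ajax calls finish

        # Simulate a user event that loads additional content via Ajax.
        # "#load-more" is a hypothetical selector for illustration only.
        if page.locator("#load-more").count() > 0:
            page.click("#load-more")
            page.wait_for_load_state("networkidle")

        html = page.content()  # serialized DOM after client-side rendering
        browser.close()
        return html
```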

Ready to start archiving your website automatically? Request a quote by clicking the button below.   

Request a Quote

 

George van Rooyen
George van Rooyen is a Content Marketing Specialist at Pagefreezer.

Related Posts

Diversity, Equity, Inclusion, and Belonging (DEIB) at Pagefreezer

Back in June, Pagefreezer’s CEO Michael Riedijk posted our Stance on Racism. In that post we made the following promises:

More Legal Lessons Learned: 5 Times Social Media Evidence Was Denied in Court

Digital content (like web pages, Facebook posts, and tweets) is increasingly being submitted as evidence during legal matters—but it isn’t always being admitted by courts. As with any other form of evidence, digital evidence needs to meet a certain standard in order to be deemed admissible—and in many cases this comes down to how the evidence was collected and authenticated. If the collection and authentication process wasn’t handled correctly—and the method employed didn’t prove authenticity beyond any reasonable doubt—the evidence typically would not be accepted.

Why You Should Use SHA-256 in Evidence Authentication

In a previous article, we discussed why hash values are crucial in evidence collection and digital forensics. Following on from that, it’s worth discussing why Pagefreezer specifically makes use of the SHA-256 hashing algorithm when applying a digital signature to one of our records.