
The Art of Archiving Complex Modern Websites

You’ve probably heard of web crawlers (sometimes also called spiders or spiderbots). Much like real spiders scurrying around, these bots systematically scour the internet for website content. They discover and visit sites by following links, copying each page they find so it can be processed.
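
To give a rough sense of how that "follow links, copy pages" loop works, here is a minimal sketch in Python using the requests and BeautifulSoup libraries. The starting URL and limits are hypothetical placeholders, and a real crawler adds politeness rules, robots.txt handling, scheduling, and storage on top of this.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

def crawl(start_url, max_pages=50):
    """Breadth-first crawl of a single site: fetch a page, keep its HTML,
    and queue any same-domain links it contains."""
    domain = urlparse(start_url).netloc
    queue, seen, pages = deque([start_url]), {start_url}, {}

    while queue and len(pages) < max_pages:
        url = queue.popleft()
        response = requests.get(url, timeout=10)
        pages[url] = response.text  # the "copy" step

        # The "discover" step: follow links to new pages on the same domain.
        soup = BeautifulSoup(response.text, "html.parser")
        for anchor in soup.find_all("a", href=True):
            link = urljoin(url, anchor["href"])
            if urlparse(link).netloc == domain and link not in seen:
                seen.add(link)
                queue.append(link)
    return pages
```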

The most common use case for this is internet search. Search engines like Google use bots to index sites for their results pages. But there are many other reasons to crawl pages as well. At PageFreezer, we use crawling technology to archive our customers’ websites. We regularly crawl sites to ensure that any change to any page is recorded and saved. Customers can then review these chronological versions of their websites through our dashboard, which allows them to replay any version as if it were live and instantly identify changes to a page.
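
As a simple illustration of what recording and comparing chronological versions can look like (this is a hedged sketch of the general idea, not PageFreezer’s dashboard or storage format), the snippet below saves each capture with a UTC timestamp and uses Python’s difflib to surface what changed between the two most recent versions of a page:

```python
import difflib
from datetime import datetime, timezone

archive = {}  # url -> list of (timestamp, html) snapshots, oldest first

def record_snapshot(url, html):
    """Append a timestamped copy of the page to its version history."""
    archive.setdefault(url, []).append(
        (datetime.now(timezone.utc).isoformat(), html)
    )

def changes_since_last(url):
    """Return a unified diff between the two most recent snapshots."""
    versions = archive.get(url, [])
    if len(versions) < 2:
        return "No earlier version to compare against."
    (old_ts, old_html), (new_ts, new_html) = versions[-2], versions[-1]
    return "\n".join(
        difflib.unified_diff(
            old_html.splitlines(), new_html.splitlines(),
            fromfile=old_ts, tofile=new_ts, lineterm="",
        )
    )
```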


Why Organizations Archive Their Websites   

Why is all this necessary? For many of our enterprise customers, it comes down to regulatory compliance. With industry regulators demanding the collection and preservation of all online communications, having an archiving solution in place is an absolute requirement. Failure to preserve records in appropriate, replayable, and verified formats can result in significant penalties.      

Want to learn more about the most advanced Website Archiving Solution available? Visit our product page, or click the button below to schedule a demo.

Schedule a Demo

 

Lawsuits, false claims, and eDiscovery responses are also common reasons. Should someone claim that a statement appeared on a company website when it never did, the organization has to be ready to disprove the claim. To do this, evidentiary-quality records of website content are needed. PageFreezer provides such records by delivering archive data that is timestamped and digitally signed, so authenticity is guaranteed.
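
To show roughly how timestamping and a digital signature make an archived record tamper-evident, here is an illustrative sketch. The library choice (Python’s hashlib and the cryptography package’s Ed25519 keys) and the key handling are assumptions for the example, not a description of PageFreezer’s internal mechanism.

```python
import hashlib
import json
from datetime import datetime, timezone

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def seal_capture(html: str, signing_key: Ed25519PrivateKey) -> dict:
    """Produce a timestamped, signed record of a captured page."""
    record = {
        "sha256": hashlib.sha256(html.encode("utf-8")).hexdigest(),
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    payload = json.dumps(record, sort_keys=True).encode("utf-8")
    record["signature"] = signing_key.sign(payload).hex()
    return record

# Verification: rebuild the signed payload and check it with the public key.
# The cryptography library raises InvalidSignature if anything was altered.
key = Ed25519PrivateKey.generate()
sealed = seal_capture("<html>...</html>", key)
payload = json.dumps(
    {k: sealed[k] for k in ("sha256", "captured_at")}, sort_keys=True
).encode("utf-8")
key.public_key().verify(bytes.fromhex(sealed["signature"]), payload)
```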

 

The Challenges of Archiving Modern Web Pages

While the idea of crawling websites is nothing new, the challenges of crawling many modern pages for archiving purposes are significant. Organizations like banks, for instance, have incredibly complex websites that consist of dynamic content, password-protected pages, personalized pages, and form flows. Capturing all of this accurately purely through crawling (without the need for complicated scripting or engineering) isn’t easy.
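
To make the password-protected-pages challenge concrete, here is a hedged sketch of the general approach using a headless browser (Playwright); the URLs, selectors, and credentials are made up for illustration and this is not PageFreezer’s engineering:

```python
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()

    # Authenticate first so protected pages render as a real user sees them.
    page.goto("https://example-bank.com/login")
    page.fill("input[name=username]", "archive-bot")
    page.fill("input[name=password]", "********")
    page.click("button[type=submit]")
    page.wait_for_load_state("networkidle")

    # Now capture a page that only exists behind the login.
    page.goto("https://example-bank.com/accounts/overview")
    protected_html = page.content()
    browser.close()
```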

In fact, it’s a challenge that many archiving companies struggle with. A leading global financial services firm recently signed a contract with PageFreezer after concluding that we had the only web crawling technology capable of accurately capturing its highly personalized, dynamic website for eDiscovery and regulatory compliance purposes.

PageFreezer conducted an extensive two-year R&D program that resulted in the third generation of our dynamic web capture technology. Thanks to this latest crawler, PageFreezer can capture client-side webpages generated by JavaScript/Ajax frameworks, including Ajax-loaded content. It can also collect multiple steps in web form flows, and it can capture webpage content that is displayed only after a user event (for example, when a section of a webpage loads additional content via Ajax after a user clicks).
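
For a sense of what capturing event-driven Ajax content and form flows involves in general, the sketch below uses a headless browser (Playwright) with hypothetical URLs and selectors. It illustrates the technique only; it is not the crawler described above.

```python
from playwright.sync_api import sync_playwright

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto("https://example.com/rates")

    # Click an element that triggers an Ajax request, then wait for the
    # dynamically inserted section before snapshotting the DOM.
    page.click("#show-more-rates")
    page.wait_for_selector("#extra-rates-table")
    snapshot_after_click = page.content()

    # Capture one step of a multi-step form flow: fill a field, submit,
    # and record the page the flow advances to.
    page.goto("https://example.com/mortgage-calculator")
    page.fill("input[name=amount]", "250000")
    page.click("button[type=submit]")
    page.wait_for_load_state("networkidle")
    snapshot_step_two = page.content()

    browser.close()
```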

Ready to start archiving your website automatically? Request a quote by clicking the button below.   

Request a Quote

 

George van Rooyen
George van Rooyen is the Content Marketing Manager at Pagefreezer.
