MarTech Interview With Ali Habibzadeh, Chief Technology Officer, Deepcrawl

Ali Habibzadeh, Chief Technology Officer at Deepcrawl, highlights a few important fundamentals that B2B marketers need to focus on when it comes to website health maintenance:

_________

Welcome to this MarTech Series chat, Ali. Tell us about yourself and your journey through the B2B tech market; we’d love to hear more about your role at Deepcrawl…

Thanks for having me…well, my journey through the tech industry has been quite varied! After graduating with a master’s in user experience design, I set my sights on a career in web and application development, focusing at first on front-end infrastructures. At that point, I didn’t have a specific industry sub-sector in mind, so I thought I’d try my hand at whatever piqued my interest.

Over the years, I’ve worked in numerous sectors, including developer roles across social networks, genealogy, law enforcement tech, and local search. Eventually, I landed in the world of advertising and MarTech.

I’ve always had a passion for development and the whole software development lifecycle, but it was in advertising and MarTech that I found the most satisfaction as a user experience professional. In many of my roles, I conducted the user research, created the designs, built the prototypes, and handled the final implementation, so I can say that my core interest is making usable software. After almost four years in the B2B MarTech space at Concep, I was ready to leap into my next challenge and took on the head of user interface role at Deepcrawl, later becoming CTO.

I was initially brought on as a contractor to consult on the project of rolling out Deepcrawl’s V2 API and UI. We were a small team of developers, so when my design consultancy was finished I moved on to implementing the new application and, after that, helping with the API. Since then, I have worked on many of our backend services, including our GraphQL API, website crawler, and data processing pipeline.

As Deepcrawl’s CTO, my job varies a lot day to day. I manage and organize a very talented group of engineers who work across many teams to expand our market-leading website intelligence platform that proves crucial to our customers’ online performance.

We’ve grown considerably in the time that I’ve been here. Even after seven years, I can confidently say that no two days are the same. Working under our new CEO, Craig Dunham, I’m proud of the work we have all achieved—particularly in overcoming the challenges posed by the onset of the pandemic and posting a positive period of growth in the years following.

We’d love to hear more about the recent enhancements to Deepcrawl’s website crawler. How do they help end users?

Few Deepcrawl innovations over my tenure have excited me as much as our new crawler. Not only have we managed to produce the fastest website crawler available on the market today, but we’ve also made it more robust for enterprise use than ever before. Our crawler is now capable of crawling at speeds of up to 450 URLs per second for non-rendered pages and 350 URLs per second for JavaScript-rendered pages. This means our users have a crawler that is always up to the task, even if their websites are enterprise-scale with millions of pages.

Deepcrawl users can monitor their websites’ technical health extremely efficiently, saving considerable time for their website teams who are then freed up to focus on important implementation and strategic work. The enhanced crawler puts the power of speed in the users’ hands — any potential site issues can be flagged sooner, helping digital teams resolve SEO and site performance problems before they become critical, traffic-draining mistakes that could impact the wider business. It’s about empowering our users to stay fully in control of their website’s technical health and performance, and freeing up more time for SEO implementation and improvements.


For marketers who use these kinds of platforms to improve website optimization processes, what practices should they be following to ensure better, quicker results?

Of course, the recommended crawl speed for any given site is very much dependent on the scalability of the client’s web server, so it’s also important for enterprise businesses, in particular, to invest in their server infrastructure to ensure it supports their site performance and crawling efforts.
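The idea of matching crawl speed to what a server can handle can be sketched as a simple token-bucket rate limiter. This is an illustrative sketch, not Deepcrawl’s actual implementation; the class name, rate values, and example URLs are all hypothetical.

```python
import time

class CrawlRateLimiter:
    """Token-bucket pacer: allows bursts of up to `capacity` requests,
    refilling at `rate` requests per second."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens added per second
        self.capacity = capacity     # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def acquire(self) -> None:
        """Block until one request token is available."""
        while True:
            now = time.monotonic()
            # Refill tokens based on elapsed time, capped at capacity
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            # Sleep just long enough for the next token to arrive
            time.sleep((1 - self.tokens) / self.rate)

# Example: cap crawling at 5 URLs/second against a smaller server
limiter = CrawlRateLimiter(rate=5, capacity=5)
start = time.monotonic()
for url in [f"https://example.com/page/{i}" for i in range(10)]:
    limiter.acquire()
    # fetch(url) would go here
elapsed = time.monotonic() - start
```

With a burst capacity of 5 and a refill rate of 5 tokens per second, the first five requests go out immediately and the remaining five are paced, so the loop takes roughly a second. Raising or lowering `rate` is how a crawl would be tuned to the target server’s capacity.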

Digital teams should address the technical aspects of their site and server to stay up to date with best practices, looking at things such as a good caching policy, load balancers, higher-grade server instances that suit day-to-day customer needs, good use of proxies, and so on. These are just a few of the techniques that can be used to improve the responsiveness of a server.

How are you seeing the demand and need for crawling software change in today’s B2B marketplace and what immediate predictions do you have here?

As the digital landscape becomes more saturated with competition and consumer expectations for high-quality digital experiences grow, a business’s website has never been a more important asset. Also, with more user demand for privacy and the decline of third-party cookies, the digital advertising landscape is changing — which means a lot of marketers are starting to focus more on organic channels, like their websites. More and more, digital leaders are coming to understand the importance of search engine optimization as a core part of today’s marketing best practices. At Deepcrawl, we like to say we’re in a search-first age. Whenever you need something or have just about any question at all, the first place most of us go is Google. It’s a good time to be in the SEO space.

Websites are also more complex than they used to be. There’s been a proliferation of JavaScript use on sites in recent years, which has the potential to cause rendering issues for search engines. It’s important to monitor your site for these kinds of technical issues to make sure they aren’t holding you back in the search engines and potentially depriving your site of opportunities for traffic and growth.

These days, enterprise businesses and listings websites can have millions of URLs on their sites. That’s a massive scale — and those very large sites need tools that can cope with that scale. That’s why we’ve been so focused on creating a platform that has the speed and flexibility to handle a huge amount of website data — and can process that data quickly.

In terms of predictions, I think that SEO and website health are only going to grow in importance. Between new restrictions in paid marketing channels like online advertising, recession concerns, and changing consumer demands, it makes sense that businesses are focusing on potentially more cost-effective, scalable organic channels, like organic search.

That’s why we’re committed to building solutions to support digital marketing leaders in these efforts and working to simplify the website optimization process for cross-functional teams—regardless of how big a brand’s website might be. In addition to the new crawler developments, we’ve also recently released Monitor Hub at Deepcrawl, which streamlines site monitoring and gets digital teams on the same page with multi-domain dashboards, customizable high-level views of the metrics that matter most for any given team, and tailored alerts to help SEOs and developers quickly address issues as they arise.


Five thoughts on what it takes for B2B teams and B2B marketers to get their sites to stand out in a crowded digital market!

1. First, you need the right resources for the job. If you don’t have the tools and people in place to monitor and optimize your site, you’re likely going to miss out on opportunities for search-driven growth and revenue. There are a lot of benefits for brands that get SEO right — however, it’s a big task and often requires involvement from multiple departments, particularly on large sites. There are a lot of great SEO tools and platforms out there, but at an enterprise level, it becomes more difficult to find the solutions that can handle the scale of very large websites. Be sure to consider the speed, flexibility, collaboration, and knowledge-sharing features of the SEO and website platforms you’re evaluating. Can a given platform efficiently handle your site’s size and large datasets? Can it help remove bottlenecks by bridging cross-functional website teams across marketing, engineering, product, and UX departments with a centralized source of shared information and robust reporting options?

2. Don’t overlook the technical side of SEO. There is little point in creating fantastic website content if no one can find it! But search engine optimization today is about a lot more than just keywords. Addressing technical SEO is really a foundational aspect of optimizing for search. Sometimes, digital marketers can end up focusing entirely on keywords and backlinks, at the expense of improving the core technical factors that support SEO, improve user experience, and ultimately, help enable conversions and revenue. Tech SEO can seem intimidating at first glance, but having the right platforms and processes in place can help simplify and streamline these efforts. Yes, you need to create great website content and think deeply about keywords. But to really succeed in search in 2022, you also need to address the underlying technical foundations of your website — things like site structure, site speed, internal linking, JavaScript rendering, using the correct canonical tags, and crawling and indexing directives.

3. Focus on user experience. This benefits your brand’s overall customer experience efforts, but it also directly contributes to SEO and can play a big role in securing conversions online. Last year, Google signaled that UX is increasingly important to its search algorithms with its announcements around Core Web Vitals. Ultimately, search engines are trying to serve their users. Website teams should also aim to serve users and provide a great user experience — it’s great for your brand and customer loyalty, and it also helps keep your SEO strategies in line with the search engines’ algorithm updates.

4. Don’t overlook mobile. These days, Google primarily looks at the mobile versions of websites for indexing and ranking purposes. And, since 2016, Google has reported that more than half of web traffic comes from mobile devices. Still, many businesses’ websites overlook UX, functionality, and speed on the mobile versions of their sites. With a majority of site traffic coming in from mobile, approaching SEO with an eye toward mobile optimization can aid search ranking and simultaneously have a big impact on user satisfaction — and revenue. (Deloitte has reported that even a 0.1-second improvement in site speed corresponded to an almost 10% increase in retail consumer spending.) Mobile matters a lot!

5. Empower your website teams to work collaboratively. A brand’s website is an important asset for the business, and it’s a shared project. You’ll likely have people contributing to your website’s success across multiple departments — SEOs, content writers, marketers, developers, UX designers, and product managers all have a role to play. Make sure you’ve got the internal processes and platforms in place to support their collaboration. This might require a change in your operational thinking to encompass broader project categories, not just individual departments. At Deepcrawl, we’ve called it a shift to Digital Ops, or a more expansive approach to digital operations, rather than keeping resources siloed within individual departments, as in Marketing Ops or Dev Ops, for example. Website teams can gain a lot in terms of time saved and goals met when they are better equipped to share data and knowledge across departments and have platforms that can serve as a centralized ‘command center’ for website intelligence and project management.

Some last thoughts, takeaways, before we wrap up!

It’s a really exciting time to be working in SEO. And it’s a really exciting time to be at Deepcrawl, in particular — it’s been great to watch the team grow over the years and work toward a shared vision to build the most scalable and flexible information retrieval and processing platform on top of the web. For those in the MarTech space, we’ll have even more announcements coming in the area of website intelligence very soon—stay tuned!


Deepcrawl is a website intelligence platform — a ‘command center’ for website technical health. It helps businesses scale their digital operations by bringing together the teams, data, and insights required for high-performing, revenue-driving websites. Powered by its world-class web crawler, Deepcrawl exposes technical and structural issues that exist in—or are about to be introduced to—your site and helps you prioritize and fix them. Take command of your website’s health with Deepcrawl and realize your website’s full commercial potential.

Ali Habibzadeh is the Chief Technology Officer at Deepcrawl.


Paroma Sen

Paroma serves as the Director of Content and Media at MarTech Series. She was formerly a Senior Features Writer and Editor at MarTech Advisor and HRTechnologist (acquired by Ziff Davis B2B).
