AWS’s S3 outage
Martijn Veldkamp
“Strategic Technology Leader | Customer’s Virtual CTO | Salesforce Expert | Helping Businesses Drive Digital Transformation”
March 1, 2017
Tuesday’s Amazon Web Services mega-outage disrupted backend storage not only for websites big and small, but also for a lot of apps and Internet of Things gadgets relying on the technology. The AWS storage offering hosts images for a lot of sites, as well as entire websites and app backends, including Nest.
In fact, the five-hour breakdown was so bad, Amazon couldn’t even update its own AWS status dashboard: its red warning icons were stranded, hosted on the broken-down side of the cloud.
The S3 buckets in the US-East-1 region became inaccessible at about 0945 PST (1745 UTC) taking out a sizable chunk of the internet as we know it.
AWS has many, many regions, and US-East-1 is just one of them. Developers are supposed to follow Disaster Recovery architecture and Best Practices and spread their applications over different data centers.
For various reasons – from the fact that programmers find distributed computing hard to the costs involved – this redundancy isn’t always coded in. And so here we are.
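To make the redundancy point concrete: a minimal sketch, assuming a replica bucket kept in sync in a second region (for example via S3 Cross-Region Replication). The bucket names and regions below are made up for illustration.

    import boto3
    from botocore.exceptions import ClientError, EndpointConnectionError

    # Hypothetical buckets: a primary in US-East-1 and a replica elsewhere.
    PRIMARY = {"region": "us-east-1", "bucket": "my-assets"}
    FALLBACK = {"region": "eu-west-1", "bucket": "my-assets-replica"}

    def fetch_object(key: str) -> bytes:
        """Try the primary region first; fall back to the replica if S3 is unreachable."""
        for target in (PRIMARY, FALLBACK):
            s3 = boto3.client("s3", region_name=target["region"])
            try:
                return s3.get_object(Bucket=target["bucket"], Key=key)["Body"].read()
            except (ClientError, EndpointConnectionError):
                continue  # this region is down or erroring, try the next one
        raise RuntimeError(f"Could not fetch {key} from any region")

Even a fallback this simple roughly doubles the storage bill for the replicated data, which is exactly the trade-off many teams decide to skip.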
Stop waiting for “Perfect Data”!
Martijn Veldkamp
“Strategic Technology Leader | Customer’s Virtual CTO | Salesforce Expert | Helping Businesses Drive Digital Transformation”
November 21, 2025
It’s never getting off the couch! The single greatest killer of innovation is not bad preparation but poor administration. Innovation projects are uniquely vulnerable to this because they are not simple IT upgrades; they are fundamental changes to business processes.
The Hype Trap: The 95% of failures are often “hype experiments.” They start with some flashy tool (like a generic AI chatbot) and go in search of a problem, rather than the other way around. They stall because they have no clear owner, no defined ROI, and no integration into the actual workflows where, you know, actual people do their jobs.
The 5% Success: The 5% that get traction ignore the marketing hype. They live in the unglamorous, high-return areas like back-office automation. They succeed because they are domain-specific (e.g., an AI that only reads lease agreements) and deeply integrated into a specific workflow.
Just do it!
There’s a common belief that AI initiatives require perfectly clean, structured and in-shape data before you can even begin. This is a form of procrastination. Let’s start tomorrow! Data is the ultimate couch potato: it will never get in shape on its own.
Waiting for Perfect: Companies that wait for a perfect, company-wide data strategy will be waiting forever. As one report on why AI projects fail notes, “garbage in, garbage out” is still a primary obstacle, leading to projects getting stuck in endless data-wrangling phases.
The Start Now Approach: Successful teams adopt a pragmatic approach. They don’t wait. One manufacturing project, for example, saw a double-digit accuracy jump not from a better model, but by simply constraining the first version to SKUs that had at least 18 months of (imperfect) historical data. They started with the data they had, proved value, and built momentum from there.
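As a minimal sketch of that kind of constraint, assuming a hypothetical orders table with sku and order_month columns, the filter could look something like this:

    import pandas as pd

    def skus_with_enough_history(orders: pd.DataFrame, min_months: int = 18) -> pd.DataFrame:
        """Keep only the rows for SKUs that have at least `min_months` of history."""
        months_per_sku = orders.groupby("sku")["order_month"].nunique()
        eligible = months_per_sku[months_per_sku >= min_months].index
        return orders[orders["sku"].isin(eligible)]

Nothing fancy: the point is to shrink the scope to data you can already trust enough, not to clean the whole warehouse first.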
Innovation fails not from bad prep, but from hype and hesitation. Start with a real workflow, use the messy data you already have, and build momentum. The unsexy projects are the ones that will actually drive benefits.
Stolen from https://static.vecteezy.com/
From Blueprint to CTO
Martijn Veldkamp
“Strategic Technology Leader | Customer’s Virtual CTO | Salesforce Expert | Helping Businesses Drive Digital Transformation”
February 14, 2025
Remember those old cartoons where the architect, with a rolled-up blueprint and a pencil behind their ear, stood stoically overseeing construction?
That image, while nostalgic, is as outdated as dial-up internet. The role of the architect has undergone a dramatic transformation, evolving from a technical specialist to a vital business strategist.
Beyond Blueprints
In today’s fast-paced digital world, architects are no longer just drawing up technical plans. They’re navigating a complex application landscape of cloud computing, artificial intelligence, data analytics, and ever-shifting business needs.
The blueprint is still important, but it’s now just one piece of a much larger puzzle.
Think of it like this: Imagine building a house. Architects used to focus primarily on the structural integrity, ensuring the walls wouldn’t fall down. Now, you also need to consider energy efficiency, smart home integration, and how the house will adapt to the family’s changing needs over time. The modern architect is concerned with the entire ecosystem over time, not just the foundation.
Captain Sparrow fan art
Business-Savvy! As an architect you need to understand the business inside and out: how to translate business goals into technical solutions, and vice versa. You need to be fluent in the language of both the C-suite and the development team.
This means understanding market trends, competitive pressures, and how technology can drive innovation and create a competitive edge.
It’s no longer enough to simply design elegant systems. The architect must also consider the business impact of their decisions. Will this architecture enable faster time to market? Will it reduce costs? Will it improve customer experience? These are the questions that keep CEOs awake at night, and architects need to have the answers.
The Architect as a Change Agent
And not a weapon of mass confusion
You are also playing a crucial role in driving digital transformation. Helping businesses adopt new Salesforce technologies, migrate away from their legacy systems, while embracing agile methodologies. This requires not only technical expertise, but also strong leadership, communication, and change management skills.
The architect is no longer just a technical expert; they are a change agent, helping organizations navigate the complexities of the digital age.
The Customer CTO
The days of the lone wolf architect working in isolation are long gone. Today’s architect is a collaborator, working closely with developers, business analysts, security experts, and other stakeholders. You need to be able to build consensus, negotiate compromises, and foster a culture of collaboration.
The modern architect understands that the best solutions are often the result of diverse perspectives and shared knowledge.
That is why I think that CTO can mean much more than just Chief Technology Officer:
Collaborative Transformation Officer, Consulting Technology Orchestrator, Connector of Technology & Organizations, Catalyst for Technology & Operations
I wanted to be even more poetic but that is quite hard with just these letters. I also kinda like the following, but I’m ending the Friday with a beer and a blog, so bear with me:
Weaver of Technology & BusinessCurator of Technological FuturesInnovator of Technological Approaches
To state some open doors
The world of technology is constantly evolving. New technologies emerge, old technologies become obsolete, and best practices are continuously being redefined. You as an architect must be a continuous learner, always seeking to expand your knowledge and stay ahead of the curve.
This means attending conferences, reading industry publications, experimenting with new technologies, and engaging with the broader architectural community. Personally I mostly have several authors that I really like that help shape my thoughts. Martin Fowler has a nice group of people that use his platform to distribute interesting ideas.
To try and close the article: organisations will need people who can operate as hybrids, blending technical expertise with business acumen, leadership skills, and a passion for innovation. They will be the CTOs of the future, shaping the digital landscape and driving business success.
What skills do you think are most important for the modern architect? I’d love to hear your thoughts in the comments below. Also, please find some more interesting finds on CTO!
Bedrock picture from DuckDuckGo and not a Minecraft one
Data Governance as the Bedrock of Effective AI Governance
Martijn Veldkamp
“Strategic Technology Leader | Customer’s Virtual CTO | Salesforce Expert | Helping Businesses Drive Digital Transformation”
November 3, 2023
As an organisation you need a plan to address the market disruption of Generative AI. You don’t need to build your own version of ChatGPT. But you need a plan on how your organisation will deal with all the initiatives that will start. Otherwise I wish you good luck with the conversation you will have when one of your CxOs comes back from some partner-paid conference stating that the company will go bankrupt if you don’t invest right now.
In this series of articles I felt the need to explore some of my current thinking on where Generative AI has its place.
Business Braveheart
In this ever-renewing push of the newest flavour of technology, the fusion of architecture, governance, and data governance stands as the cornerstone for reliability.
As organisations navigate their discovery of the complex realm of artificial intelligence, it becomes increasingly apparent that the success of implementing one of these LLMs (ChatGPT, Bard or Bing AI) is deeply entwined with the quality, security, and integrity of the data that they need and produce.
Effective AI governance is not just about fine-tuning algorithms or optimising your LLM models. It begins with the bedrock of quality.
Garbage in, garbage out
Feedback loop
It’s the quality, accuracy, and reliability of the input data that dictates the usefulness of AI’s output. Thus, a holistic approach needs a very strong foundation in data quality and its governance. And remember, the prompts that you use to get results are also data that needs to be governed. How else will you establish a feedback loop on effective usage of the tool?
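As an illustration of treating prompts as governed data, here is a minimal sketch of a prompt log; the field names and the JSONL file are assumptions for the example, not a specific product.

    import json
    import uuid
    from datetime import datetime, timezone

    def log_prompt(user: str, model: str, prompt: str, response: str,
                   path: str = "prompt_log.jsonl") -> None:
        """Append one prompt/response pair, with provenance metadata, to a JSONL log."""
        record = {
            "id": str(uuid.uuid4()),
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "model": model,
            "prompt": prompt,
            "response": response,
        }
        with open(path, "a", encoding="utf-8") as f:
            f.write(json.dumps(record) + "\n")

Once prompts and outputs are captured like any other data set, the usual governance questions (ownership, retention, quality) apply to them too, and the feedback loop has something to work with.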
The Interdependence of Data Governance and AI Governance
Data governance, as I’ve stated in the previous blog posts, primarily concerns itself with the management, availability, integrity, usability, and security of an organisation’s data.
AI in any form, by its nature, operates as an extension of the data it is fed. Without a sturdy governance structure over the data that you produce, AI governance becomes a moot point. On another note, I’m still surprised nobody has come up with an AI that generates cute cat short clips for a YouTube channel. Wait, I’m on to something here…
Quality Data: The Lifeblood of AI
A key aspect is that the quality of data isn’t an isolated attribute but a collective responsibility of various departments within an organisation. We all know that, but where does the generated data sit?
In the past I wrote about Systems Thinking and I still have to plot for myself where Generative AI sits. Is it like our imagination? Where do you master the data an LLM generates for you? Can I re-generate it reliably? What happens with newer generated outcomes? Are these better than the old ones? Is the generated response email owned by the Service Department or the AI team? These articles are as much for you as for me, to fully grok where LLMs and their outcomes sit in the system.
Security and Ethical Implications
Privacy concerns, compliance with regulations, and ethical considerations in handling and processing data are pivotal components of data governance. As AI systems often deal with sensitive information, ensuring compliance with data protection regulations and ethical use of data becomes a critical component of AI governance. The same goes for the outputs. Where are they used or stored? How do these different data providers compare? The question that popped up in my head: within the Salesforce ecosystem we use a lot of Account data and have linked it with third-party providers. We enrich the data we have on the customer with Dun & Bradstreet information, or in the Netherlands with the KVK register. What happens to the ‘authority score’ if we add Generative AI to the mix? We still have a lot to discover together.
Keep it simple
Keep calm meme-o-matic
In short, because I harped on it before, organisations should:
Establish Comprehensive Data Governance Frameworks: Institute clear policies for data ownership, stewardship, and data management processes. This not only fosters quality but also ensures accountability and responsibility in data handling.
Promote Cross-Functional Collaboration: Break down silos and encourage collaboration between various departments. Not just good for data quality but for many more aspects in life.
Leverage Automation for Data Quality Assurance: Harness the power of automation tools to identify anomalies and inconsistencies within data, ensuring high-quality inputs for AI models (see the sketch after this list). Ever did a large migration from one system to another? Right, automation for the win!
Continuously Monitor and Improve Data Governance: Implement systems for ongoing monitoring of data quality. We have a Dutch expression which translates to something like “the polluter pays”. Bad data has so many downstream effects that I almost want to advise a monthly blame-and-shame highlight list. Let’s forget about that for now. I do however want to stress a carrot-and-stick approach.
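For the automation item above, a minimal sketch of what such a check could look like, assuming a hypothetical customer table with customer_id, email and country columns:

    import pandas as pd

    def data_quality_report(customers: pd.DataFrame) -> dict:
        """Count a few common data quality issues so they can be tracked over time."""
        emails = customers["email"].dropna().astype(str)
        return {
            "rows": len(customers),
            "duplicate_ids": int(customers["customer_id"].duplicated().sum()),
            "missing_email": int(customers["email"].isna().sum()),
            "invalid_email": int((~emails.str.contains("@")).sum()),
            "missing_country": int(customers["country"].isna().sum()),
        }

Run something like this on a schedule and plot the numbers, and you have the ongoing monitoring from the last item without needing any blame-and-shame list.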
Conclusion
In my subsequent articles, I’ll try to delve deeper into the practical strategies and steps organisations can adopt and make it more Salesforcy.
Borrowed from https://blog.kore.ai/conversational-ai-top-20-trends-for-2020
AI Governance
Martijn Veldkamp
“Strategic Technology Leader | Customer’s Virtual CTO | Salesforce Expert | Helping Businesses Drive Digital Transformation”
October 4, 2023
AI governance is crucial to strike a balance between harnessing the promised benefits of AI technology and safeguarding against its potential risks and negative consequences.
It’s a very interesting and still emerging problem: how to effectively govern the creation, deployment and management of these new AI services, end to end.
Heavily regulated industries, such as banking, and public services, such as tax agencies, are legally required to provide a level of transparency on how they operate their IT. And this is also true for their AI models. Failure to offer this transparency can lead to severe penalties. AI models, like the algorithms that preceded them, can no longer function as a mystery.
The funny thing is that AI and its governance is a hot topic, but Data Governance…? That ranges from boring to “Wasn’t that solved already?”.
It’s still all about that data
The real challenge is always the data. If you think about it, AI is about what data you trained it on, what you want to use it on, and when. So AI Governance is not just about the algorithms and the models. It starts with Data.
It’s no longer enough to secure your data and say you will comply with privacy laws. You have to be, verifiably, in control of the data you are using, both into and out of the AI models you are using.
Its source, its provenance and its ownership. What rights do you have? What rights does the provider of that data have? We are not even beginning to scratch the surface of how to enforce those rights.
When planning to use AI, ensure your Data is accurate, complete, and of high quality
And this starts at the collection of data. Does it provide accurate information? Am I missing data? Is the source reliable, timely and of high quality? How will we measure and assess whether the data collection is working as intended?
Human error is one of the easiest ways to lose data integrity. Well, you can call it human error, but it also boils down to whether the system is set up in a way that it actually makes sense to enter all that data here and now. If users are not willing to enter all the necessary data, your data sets will never be of a high enough quality.
Systems integrating with one another are also a great way to lose data integrity. The moment systems have different implementations of concepts like Customer or Order, and of their lifecycles, you will have a hard time combining that data.
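To illustrate that integration pain, here is a minimal sketch that maps two hypothetical representations of a customer (Salesforce-style CRM fields versus a billing system) onto one shared shape before combining them; all field names are made up.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Customer:
        customer_id: str
        name: str
        email: Optional[str]

    def from_crm(record: dict) -> Customer:
        # The CRM keeps the email on a nested contact object.
        return Customer(record["Id"], record["Name"],
                        record.get("Contact", {}).get("Email"))

    def from_billing(record: dict) -> Customer:
        # Billing uses different field names and may not hold an email at all.
        return Customer(record["account_number"], record["account_name"],
                        record.get("email"))

The mapping itself is trivial; the hard part is agreeing on what a Customer is and which system is allowed to say so.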
In order to successfully introduce AI in your business, you have to be in control of your data. That data is created throughout your application landscape. From the earlier iterations of Data Warehouses to the rise to prominence of Data Scientists, it’s all about Data: its integrity, security and quality. That hasn’t changed.
How you frame your problem will influence how you solve it
According to The Systems Thinker, if a problem meets these four criteria, it could benefit from a systems thinking approach:
The issue is important
The problem is recurring
The problem is familiar and has a known history
People have unsuccessfully tried to solve the problem
We need to make sense of the complexity of the application landscape by looking at it in terms of wholes and relationships rather than by splitting it down into its parts. Then the flow of data throughout will start to make sense. And then we can start addressing the lack of Data Quality, Data Security and Data Integrity.
And who knows, if that all is solved we can start thinking about where it actually makes sense to introduce AI.
Human Error -> Root cause of AWS S3 outage is found
Martijn Veldkamp
“Strategic Technology Leader | Customer’s Virtual CTO | Salesforce Expert | Helping Businesses Drive Digital Transformation”
March 3, 2017
An authorized S3 team member using an established playbook executed a command which was intended to remove a small number of servers for one of the S3 subsystems that is used by the S3 billing process. Unfortunately, one of the inputs to the command was entered incorrectly and a larger set of servers was removed than intended.
The servers that were inadvertently removed supported two other S3 subsystems. One of these subsystems, the index subsystem, manages the metadata and location information of all S3 objects in the region.
While these subsystems were being restarted, S3 was unable to service requests. Other AWS services in the US-EAST-1 Region that rely on S3 for storage, including Elastic Compute Cloud (EC2), Elastic Block Store (EBS) volumes and AWS Lambda were also impacted while the S3 APIs were unavailable.
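Not the actual AWS tooling, but as an illustration of the kind of safeguard that limits the blast radius of such a command, here is a sketch that refuses to remove more than a small fraction of the fleet in one go:

    from typing import List

    def servers_to_remove(requested: List[str], fleet_size: int,
                          max_fraction: float = 0.05) -> List[str]:
        """Cap how many servers a single command may remove, relative to the fleet."""
        limit = max(1, int(fleet_size * max_fraction))
        if len(requested) > limit:
            raise ValueError(
                f"Refusing to remove {len(requested)} servers at once; "
                f"the limit is {limit} ({max_fraction:.0%} of a fleet of {fleet_size})"
            )
        return requested

AWS’s own write-up, linked below, describes adding safeguards in this spirit to the tool involved.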
Read the whole story here
LinkedIn acquired by Microsoft for a whopping $26.1 billion. But why?
LinkedIn is essentially the Facebook of the business world, and the digital repository of most of the world’s resumes. LinkedIn has roughly 100 million members in Europe out of a total of 450 million. Very few people lie on their publicly viewable and controllable resume. And that’s information Microsoft is willing to pay for.
Well, Microsoft already knows a lot about you. They have your calendar (Outlook), your meetings (Outlook) and your accounts (Microsoft Dynamics CRM). By buying LinkedIn, Microsoft gains even more data to feed into its machine learning and business intelligence processes. Some think it will feed Cortana, so the start of a business meeting may look like this:

Right now, Cortana provides some basic information about your calendar, suggesting, for example, what time you’ll need to leave to arrive at your next meeting on time. In Microsoft’s digital future, Cortana will be able to sum up what you need to know about your business relationship, and what information you can use to cement a more personal connection, too.