{"id":67362,"date":"2023-04-27T09:59:27","date_gmt":"2023-04-27T04:29:27","guid":{"rendered":"https:\/\/cyfuture.cloud\/blog\/?p=67362"},"modified":"2023-04-28T10:03:08","modified_gmt":"2023-04-28T04:33:08","slug":"why-many-engineers-dont-understand-serverless","status":"publish","type":"post","link":"https:\/\/cyfuture.cloud\/blog\/why-many-engineers-dont-understand-serverless\/","title":{"rendered":"Why Many Engineers Don&#8217;t Understand Serverless?"},"content":{"rendered":"<div id=\"toc_container\" class=\"no_bullets\"><p class=\"toc_title\">Table of Contents<\/p><ul class=\"toc_list\"><li><a href=\"#Critique_of_Serverless\">Critique of Serverless<\/a><\/li><li><a href=\"#What_Are_Some_Engineers_Missing_The_True_Benefits_of_Serverless\">What Are Some Engineers Missing? The True Benefits of Serverless<\/a><\/li><li><a href=\"#The_low_costs_of_serverless_may_outweigh_any_drawbacks\">The low costs of serverless may outweigh any drawbacks<\/a><\/li><li><a href=\"#The_cold_start_is_a_question_of_configuration_and_budget\">The cold start is a question of configuration and budget<\/a><\/li><li><a href=\"#Techniques_to_improve_the_latency_of_Lambda_functions\">Techniques to improve the latency of Lambda functions<\/a><\/li><li><a href=\"#What_latency_is_acceptable_by_workloads\">What latency is acceptable by workloads?<\/a><\/li><li><a href=\"#Serverless_is_about_NoOps_and_Scalability\">Serverless is about \u201cNoOps\u201d and Scalability<\/a><\/li><li><a href=\"#Use_cases_that_strongly_benefit_from_serverless\">Use cases that strongly benefit from serverless<\/a><\/li><li><a href=\"#Code_speed_vs_speed_of_development_cycles\">Code speed vs. 
speed of development cycles<\/a><\/li><li><a href=\"#Seamless_integration_with_other_cloud_services\">Seamless integration with other cloud services<\/a><\/li><li><a href=\"#The_Downsides_of_Serverless\">The Downsides of Serverless<\/a><\/li><li><a href=\"#In_a_Nutshell\">In a Nutshell<\/a><\/li><\/ul><\/div>\n\n<p><span style=\"font-weight: 400;\">Serverless computing has emerged as a popular technology in recent years, offering scalable, cost-effective, and flexible solutions for application development. However, despite its benefits, many engineers still struggle to understand and adopt this technology.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">A survey conducted by the Cloud Native Computing Foundation (CNCF) in 2020 revealed that only 27% of respondents were familiar with serverless computing. This suggests that a significant number of engineers have yet to be exposed to the technology, and therefore may not fully understand its potential.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Furthermore, a report from the research firm Gartner found that many organizations have difficulty finding skilled serverless developers, indicating a shortage of understanding and expertise in the field. This shortage can lead to a slower adoption rate of serverless computing, which can have a negative impact on a company&#8217;s competitiveness.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">To learn more about these challenges and how to overcome them, read on for further details.<\/span><\/p>\n<h2><span id=\"Critique_of_Serverless\">Critique of Serverless<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">As with any technology, serverless computing is not without its criticisms. One critique of serverless is that it can be more difficult to manage and monitor than traditional computing, as it involves multiple third-party services and functions that need to be integrated and coordinated. 
This complexity can make it challenging to troubleshoot and optimize performance, which can lead to increased downtime and decreased productivity.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Another criticism of serverless is that it can lead to vendor lock-in, as organizations become increasingly dependent on specific<\/span><a href=\"https:\/\/cyfuture.cloud\/\"><b> cloud providers<\/b><\/a><span style=\"font-weight: 400;\"> and services. This can limit their flexibility and control over their applications, as well as potentially lead to higher costs and reduced innovation.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Additionally, serverless computing may not be the best fit for all types of applications. Applications with long-running processes, high computational requirements, or real-time data processing needs may not be well-suited to a serverless architecture.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Despite these criticisms, <\/span><a href=\"https:\/\/cyfuture.cloud\/blog\/serverless-computing-the-next-step-in-cloud-infrastructure\/\"><span style=\"font-weight: 400;\">serverless computing <\/span><\/a><span style=\"font-weight: 400;\">continues to gain popularity due to its scalability, cost-effectiveness, and flexibility. Many organizations have successfully adopted serverless and reaped its benefits, and the technology is expected to continue to evolve and improve over time.<\/span><\/p>\n<h2><span id=\"What_Are_Some_Engineers_Missing_The_True_Benefits_of_Serverless\">What Are Some Engineers Missing? The True Benefits of Serverless<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Some engineers may not fully appreciate the technical benefits of serverless computing, including:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Event-driven architecture: Serverless functions are event-driven, meaning that they are triggered by specific events or requests, such as HTTP requests or changes to a database. 
This allows for a more efficient and responsive architecture that can scale dynamically based on demand.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Function-as-a-service (FaaS) model: Serverless computing is based on a FaaS model, which means that developers can focus on writing code for specific functions rather than managing the underlying infrastructure. This abstraction layer allows for a more streamlined development process and can reduce the amount of time and effort required to develop and deploy applications.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Serverless databases: Many serverless computing platforms offer serverless databases, which can eliminate the need for traditional database management and maintenance tasks. These databases can scale automatically and are designed to work seamlessly with serverless functions.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Resource optimization: Serverless computing platforms can optimize resource allocation based on actual usage patterns, which can lead to significant cost savings. This means that engineers can focus on writing efficient and effective code, rather than worrying about resource allocation and management.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Overall, serverless computing offers a powerful and efficient architecture that can significantly simplify and streamline the development and deployment of applications. By leveraging the event-driven architecture, FaaS model, serverless databases, and resource optimization, engineers can develop and deploy applications that are more scalable, cost-effective, and responsive than traditional computing architectures.<\/span><\/p>\n<h2><span id=\"The_low_costs_of_serverless_may_outweigh_any_drawbacks\"><strong>The low costs of serverless may outweigh any drawbacks<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">The low costs of serverless computing can indeed outweigh any potential drawbacks. 
By paying only for the resources that are actually used, organizations can significantly reduce their costs and avoid the need for costly upfront investments in infrastructure and hardware.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In addition, serverless computing can offer significant cost savings by eliminating the need for manual scaling and management of resources. This can be particularly beneficial for small to medium-sized businesses that may not have the resources to invest in expensive hardware and infrastructure.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Furthermore, the scalability and flexibility of serverless computing can also enable organizations to innovate and iterate more quickly, which can lead to increased productivity and competitiveness. By leveraging the event-driven architecture and FaaS model, engineers can focus on writing code and developing applications, rather than managing infrastructure and resources.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">While there may be potential drawbacks to serverless computing, such as increased complexity and vendor lock-in, many organizations have successfully adopted serverless and reaped its benefits. By carefully evaluating their needs and considering the potential benefits and drawbacks, organizations can determine whether serverless computing is the right choice for their specific applications and workloads.<\/span><\/p>\n<h2><span id=\"The_cold_start_is_a_question_of_configuration_and_budget\">The cold start is a question of configuration and budget<\/span><\/h2>\n<p><span style=\"font-weight: 400;\">The &#8220;cold start&#8221; problem is a well-known issue in serverless computing, which refers to the delay that can occur when a function is first invoked after being idle for a period of time. 
This delay is caused by the need to initialize the environment and resources needed to execute the function, which can result in longer response times and reduced performance.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">However, it is important to note that the cold start problem is not necessarily an inherent drawback of serverless computing, but rather a question of configuration and budget. With the right configuration and adequate resources, organizations can mitigate the impact of cold starts and ensure that their applications are performing optimally.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, one solution to the cold start problem is to use &#8220;warm&#8221; functions, which are pre-initialized and ready to respond quickly to requests. This can be achieved by using techniques such as scheduling periodic &#8220;keep-alive&#8221; requests or pre-warming functions in advance of expected spikes in traffic.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In addition, organizations can allocate sufficient resources and optimize their function code to minimize the impact of cold starts. By properly configuring their serverless environment and investing in adequate resources, organizations can ensure that their applications are performing optimally and delivering the desired user experience.<\/span><\/p>\n<h2><span id=\"Techniques_to_improve_the_latency_of_Lambda_functions\"><strong>Techniques to improve the latency of Lambda functions<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">There are several techniques that developers can use to improve the latency of their Lambda functions and mitigate the impact of cold starts. 
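As a rough sketch of the "keep-alive" idea described above, the loop below pings a function at a fixed interval so the platform keeps a warm instance around (`invoke_function` is a stand-in for a real cloud SDK call, not a specific API):

```python
import threading
import time

def keep_warm(invoke_function, interval_seconds, stop_event):
    # Periodically send a lightweight no-op ping so the platform
    # keeps a warm instance of the function; stops when stop_event is set.
    while not stop_event.is_set():
        invoke_function({"source": "keep-alive"})
        stop_event.wait(interval_seconds)

# Demonstration with a stand-in for the real invocation call:
pings = []
stop = threading.Event()
warmer = threading.Thread(target=keep_warm, args=(pings.append, 0.01, stop))
warmer.start()
time.sleep(0.05)  # let a few pings happen
stop.set()
warmer.join()
```

In practice this is usually implemented as a scheduled event rule in the cloud provider rather than a long-running thread; the sketch only illustrates the pattern.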
Some of these techniques include:<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8211; <\/span><b>Keeping functions warm:<\/b><span style=\"font-weight: 400;\"> By scheduling periodic &#8220;keep-alive&#8221; invocations, you can keep instances of your functions initialized and ready to respond quickly to requests. This can help to reduce the impact of cold starts and improve overall performance.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8211; <\/span><b>Using provisioned concurrency: <\/b><span style=\"font-weight: 400;\">With provisioned concurrency, you can pre-warm your Lambda functions and ensure that there are always warm instances available to respond to requests. This can help to eliminate the impact of cold starts altogether and ensure consistent performance.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8211; <\/span><b>Reducing function size:<\/b><span style=\"font-weight: 400;\"> The larger your Lambda function and its dependencies, the longer it will take to initialize. By reducing the size of your function code and dependencies, you can shorten initialization time and lessen the impact of cold starts.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8211; <\/span><b>Optimizing code:<\/b><span style=\"font-weight: 400;\"> By optimizing your function code and reducing unnecessary processing, you can help to improve performance and reduce latency. This can be achieved by using techniques such as caching, code splitting, and reducing the number of network calls.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">&#8211; <\/span><b>Using a content delivery network (CDN): <\/b><span style=\"font-weight: 400;\">By using a CDN to cache and serve static assets, you can reduce the amount of traffic that needs to be processed by your Lambda functions. 
This can help to reduce the impact of cold starts and improve overall performance.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">By leveraging these techniques and adopting best practices for serverless development, developers can ensure that their Lambda functions are performing optimally and delivering the desired user experience.<\/span><\/p>\n<h2><span id=\"What_latency_is_acceptable_by_workloads\"><strong>What latency is acceptable for workloads?<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">The acceptable latency for a workload can vary depending on the specific application and use case. For example, a gaming or real-time application may require very low latency to ensure a seamless user experience, while a batch processing or analytics application may be more tolerant of higher latency.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In general, interactive applications need response times well under one second to feel responsive, while a few seconds may be acceptable for less interactive workloads. The exact acceptable latency will depend on the specific application requirements and user expectations.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">When designing serverless applications, it is important to carefully evaluate the acceptable latency for each workload and optimize the environment and resources accordingly. By leveraging techniques such as provisioned concurrency, pre-warming functions, and optimizing code, developers can ensure that their applications are performing optimally and meeting the desired latency requirements.<\/span><\/p>\n<h2><span id=\"Serverless_is_about_NoOps_and_Scalability\"><strong>Serverless is about \u201cNoOps\u201d and Scalability<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Serverless computing is often referred to as &#8220;NoOps&#8221; because it enables developers to focus on writing code and developing applications, rather than managing infrastructure and resources. 
By abstracting away the underlying infrastructure and providing a fully managed environment, serverless computing allows developers to deploy and scale their applications quickly and easily, without the need for extensive DevOps resources.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">In addition to the benefits of NoOps, serverless computing also provides significant scalability benefits. By leveraging the event-driven architecture and function-as-a-service (FaaS) model, serverless applications can automatically scale up or down in response to changes in demand. This can help to ensure that the application is always available and performing optimally, without requiring manual intervention or resource allocation.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Furthermore, the scalability benefits of serverless computing can also enable organizations to innovate and iterate more quickly, which can lead to increased productivity and competitiveness. By removing the need for manual scaling and resource management, developers can focus on writing code and developing applications, rather than managing infrastructure and resources.<\/span><\/p>\n<h2><span id=\"Use_cases_that_strongly_benefit_from_serverless\"><strong>Use cases that strongly benefit from serverless<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Serverless computing can provide benefits across a wide range of use cases and application types, but there are several areas where it can be particularly advantageous. Here are some of the use cases that strongly benefit from serverless:<\/span><\/p>\n<ol>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Web and Mobile Applications:<\/b><span style=\"font-weight: 400;\"> Serverless computing can be a great fit for web and mobile applications that have unpredictable traffic patterns or require high scalability. 
With serverless, developers can deploy functions that automatically scale in response to changes in demand, without needing to manage infrastructure resources.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Event-driven applications: <\/b><span style=\"font-weight: 400;\">Event-driven applications, such as those used for IoT, machine learning, and real-time data processing, can benefit from serverless computing&#8217;s event-driven architecture. Serverless can provide a highly scalable and efficient way to process large volumes of events in real-time.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Batch processing: <\/b><span style=\"font-weight: 400;\">Batch processing applications that require high processing power and the ability to scale quickly can benefit from serverless computing&#8217;s ability to quickly scale up and down. This can help to reduce processing times and improve overall efficiency.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Chatbots and voice assistants: <\/b><span style=\"font-weight: 400;\">Chatbots and voice assistants require highly responsive and scalable back-end processing to deliver fast and reliable user experiences. With serverless computing, developers can easily create and deploy functions that handle user interactions, data processing, and integrations with third-party services, without having to worry about managing servers or infrastructure.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>API development: <\/b><span style=\"font-weight: 400;\">Serverless computing can be an ideal option for building and deploying APIs that require high scalability and availability. 
Developers can create serverless functions that handle API requests and automatically scale up or down in response to changes in demand.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>Microservices:<\/b><span style=\"font-weight: 400;\"> Serverless computing can be used to develop and deploy microservices that can be independently scaled and managed. By breaking down applications into smaller, more modular components, developers can create highly scalable and efficient systems that can be easily updated and maintained.<\/span><\/li>\n<li style=\"font-weight: 400;\" aria-level=\"1\"><b>DevOps automation: <\/b><span style=\"font-weight: 400;\">Serverless computing can be used to automate DevOps processes such as continuous integration and delivery (CI\/CD). By creating serverless functions that automatically build, test, and deploy code, developers can streamline the development process and reduce the need for manual intervention.<\/span><\/li>\n<\/ol>\n<h2><span id=\"Code_speed_vs_speed_of_development_cycles\"><strong>Code speed vs. speed of development cycles<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">In software development, there is often a trade-off between code speed and speed of development cycles. Code speed refers to the performance and efficiency of the code, while speed of development cycles refers to the speed at which developers can create, test, and deploy new features and updates.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">With traditional development approaches, there is often a focus on code speed, with developers spending significant time optimizing code for performance and efficiency. While this can result in highly performant applications, it can also slow down the development cycle and make it difficult to iterate quickly.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Serverless computing can help to balance the trade-off between code speed and development cycle speed. 
Because the infrastructure is fully managed, developers can spend their time writing application code rather than provisioning and operating servers.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">This can help to speed up the development cycle and enable organizations to iterate more quickly, while still ensuring that the code is highly performant and efficient. Additionally, serverless computing&#8217;s automatic scaling and event-driven architecture can help to ensure that the application is always available and performing optimally, without requiring manual intervention or resource allocation.<\/span><\/p>\n<h2><span id=\"Seamless_integration_with_other_cloud_services\"><strong>Seamless integration with other cloud services<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">One of the key benefits of serverless computing is its seamless integration with other cloud services. With serverless, developers can easily integrate their code with other cloud services, such as databases, storage, messaging, and event services, without having to manage infrastructure or worry about compatibility issues.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">For example, with AWS Lambda, developers can integrate their code with other AWS services such as Amazon S3, Amazon DynamoDB, and Amazon API Gateway, using built-in integrations and APIs. This allows developers to easily create serverless applications that can process and store data, interact with other applications, and respond to events in real-time.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">Additionally, serverless computing can integrate with third-party services through APIs and webhooks. 
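A sketch of the Lambda-to-DynamoDB pattern mentioned above: the handler persists an incoming order event to a table. The table object here mimics the put_item(Item=...) shape of a DynamoDB table resource, and the table and field names are hypothetical:

```python
def save_order(event, table):
    # Persist an incoming order event to a key-value table.
    # `table` mimics the put_item(Item=...) shape of a DynamoDB table.
    item = {
        "order_id": event["order_id"],
        "amount": event["amount"],
        "status": "received",
    }
    table.put_item(Item=item)
    return {"statusCode": 200, "body": item["order_id"]}

# In-memory stand-in so the handler can be exercised without AWS:
class FakeTable:
    def __init__(self):
        self.items = []
    def put_item(self, Item):
        self.items.append(Item)

table = FakeTable()
result = save_order({"order_id": "o-123", "amount": 42}, table)
```

In a real deployment the FakeTable stand-in would be replaced by a client from the provider's SDK (for example, a boto3 Table resource), leaving the handler itself unchanged.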
This allows developers to easily incorporate third-party services, such as payment gateways, authentication providers, and machine learning services, into their serverless applications.<\/span><\/p>\n<p><span style=\"font-weight: 400;\">By leveraging the seamless integration capabilities of serverless computing, developers can create highly efficient and scalable applications that can easily integrate with other cloud services and third-party providers. This can help to accelerate development cycles, reduce costs, and improve overall application performance and functionality.<\/span><\/p>\n<h2><span id=\"The_Downsides_of_Serverless\"><strong>The Downsides of Serverless<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">While serverless computing has many benefits, there are also some downsides to consider. Here are a few potential drawbacks:<\/span><\/p>\n<p><b>Vendor lock-in:<\/b><span style=\"font-weight: 400;\"> Adopting a serverless architecture often means relying heavily on a single cloud provider&#8217;s platform and services. This can create vendor lock-in, making it difficult and costly to migrate to a different platform if needed.<\/span><\/p>\n<p><b>Limited control:<\/b><span style=\"font-weight: 400;\"> While serverless computing can provide developers with greater flexibility and productivity, it also limits their control over the underlying infrastructure. This can make it difficult to troubleshoot issues, customize performance, or optimize resources for specific use cases.<\/span><\/p>\n<p><b>Cold start delays: <\/b><span style=\"font-weight: 400;\">As we mentioned earlier, cold starts can cause latency issues for serverless functions, particularly those with infrequent usage. 
While techniques exist to mitigate cold start delays, they can add complexity to the development process.<\/span><\/p>\n<p><b>Debugging challenges:<\/b><span style=\"font-weight: 400;\"> Debugging serverless applications can be challenging, particularly for complex or distributed applications. Debugging tools and techniques must be adapted to account for the distributed and ephemeral nature of serverless architectures.<\/span><\/p>\n<p><b>Increased complexity:<\/b><span style=\"font-weight: 400;\"> Serverless architectures can add complexity to an application&#8217;s design and implementation, particularly as applications grow in size and complexity. This can require specialized knowledge and expertise, potentially slowing down development cycles.<\/span><\/p>\n<h2><span id=\"In_a_Nutshell\"><strong>In a Nutshell<\/strong><\/span><\/h2>\n<p><span style=\"font-weight: 400;\">Serverless computing is a <\/span><a href=\"https:\/\/cyfuture.cloud\/blog\/cloud-computing-is-set-to-overhaul-ed-tech-in-2023\/\"><b>cloud computing<\/b><\/a><span style=\"font-weight: 400;\"> model that allows developers to run their code in a fully managed environment, without the need to manage underlying infrastructure. This approach can provide several benefits, including increased productivity, scalability, and reduced costs. However, there are also potential drawbacks to consider, such as vendor lock-in, limited control over infrastructure, cold start delays, debugging challenges, and increased complexity. Ultimately, organizations must carefully consider the benefits and drawbacks of serverless computing before adopting this approach, and ensure that it is the right fit for their specific use cases and workloads.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Table of ContentsCritique of ServerlessWhat Are Some Engineers Missing? 
The True Benefits of ServerlessThe low costs of serverless may outweigh any drawbacksThe cold start is a question of configuration and budgetTechniques to improve the latency of Lambda functionsWhat latency is acceptable by workloads?Serverless is about \u201cNoOps\u201d and ScalabilityUse cases that strongly benefit from serverlessCode speed [&hellip;]<\/p>\n","protected":false},"author":34,"featured_media":67363,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[517],"tags":[],"acf":[],"_links":{"self":[{"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/posts\/67362"}],"collection":[{"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/users\/34"}],"replies":[{"embeddable":true,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/comments?post=67362"}],"version-history":[{"count":1,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/posts\/67362\/revisions"}],"predecessor-version":[{"id":67364,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/posts\/67362\/revisions\/67364"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/media\/67363"}],"wp:attachment":[{"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/media?parent=67362"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/categories?post=67362"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/cyfuture.cloud\/blog\/wp-json\/wp\/v2\/tags?post=67362"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}