To deploy your Docker application on Google Cloud Platform (GCP), you can follow these general steps:
Create a Google Cloud project: If you haven’t done so already, create a project on the GCP Console (console.cloud.google.com) and enable the necessary APIs and services, such as the Compute Engine and Container Registry.
Build and push your Docker image: Build your Docker image locally using the Dockerfile in your application’s directory. Once the image is built, you need to push it to a container registry on GCP. The Container Registry allows you to store and manage your container images. You can use the following command to push your image:
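# tag the local image for the Container Registry, then push it
docker tag your-image-name gcr.io/your-project-id/your-image-name
docker push gcr.io/your-project-id/your-image-name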
Replace your-project-id with your GCP project ID and your-image-name with the desired name for your image.
Create a Compute Engine instance: In GCP, you can use a Compute Engine instance as a virtual machine to run your Docker containers. Create an instance with the desired configuration, including machine type, disk size, and networking settings. Ensure that Docker is installed on the instance.
SSH into the Compute Engine instance: Once the instance is created, SSH into the instance using the SSH button provided on the Compute Engine instance details page. This will open a terminal window where you can execute commands on the instance.
Pull and run your Docker image: On the Compute Engine instance, pull your Docker image from the Container Registry using the following command:
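docker pull gcr.io/your-project-id/your-image-name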
After pulling the image, you can run it as a container:
docker run -d -p 80:80 gcr.io/your-project-id/your-image-name
This command runs the container in the background (-d flag) and maps port 80 of the host machine to port 80 of the container (-p 80:80 flag).
Configure firewall rules: By default, incoming traffic to the Compute Engine instance is blocked. If your application needs to be accessible from the internet, you must configure firewall rules to allow incoming traffic to the desired ports (e.g., port 80). In the GCP Console, navigate to the “VPC network” section and create a firewall rule allowing traffic on the desired port.
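If you prefer the command line, a rule along these lines opens port 80 (the rule name allow-http is just an example):
gcloud compute firewall-rules create allow-http --allow tcp:80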
Access your application: Once the container is running and the firewall rules are configured, you should be able to access your application by navigating to the external IP address of your Compute Engine instance in a web browser.
These steps outline deploying a Docker application on GCP using Compute Engine. YMMV.
WordPress is a popular content management system that powers millions of websites around the world. One of the crucial aspects of managing a WordPress site is keeping it up to date, including its core files. WordPress updates are important as they usually contain bug fixes, new features, and security patches that keep the site secure and stable.
The conventional way to update WordPress is by using the WordPress dashboard. However, this method can be time-consuming and sometimes challenging to complete. Fortunately, there is an easier and more efficient way to update WordPress core files using WP-CLI.
This article will provide you with a step-by-step guide on updating WordPress core files using WP-CLI.
What is WP-CLI?
WP-CLI is a command-line interface for WordPress. It allows users to manage their WordPress site from the command line, which can be more efficient than using the graphical interface. WP-CLI can help you do a variety of tasks, such as updating WordPress core files, installing and updating plugins and themes, managing users, and creating new posts and pages.
Updating WordPress Core Using WP-CLI
Updating WordPress core files using WP-CLI is easy and straightforward. Here’s how to do it:
Step 1: Backup Your Site
Before updating WordPress core files, it is important to back up your site, including your database and files. This ensures that you can always restore your site to its previous state if anything goes wrong during the update.
Step 2: Access Your Site via SSH
Next, access your website through a terminal program using Secure Shell (SSH) protocol. SSH is a secure and encrypted protocol that allows you to access your server’s terminal securely. If you don’t know how to access your site via SSH, contact your web host provider for assistance.
Step 3: Navigate to the WordPress Root Directory
Navigate to the root directory of your WordPress site using the cd command in the terminal. This directory contains all the WordPress files and folders, including the wp-admin, wp-includes, and wp-content folders.
Step 4: Check for Updates
To check if there are updates available for WordPress core files, use the following command:
wp core check-update
This command will display the current version of WordPress you are running and if there are any updates available.
Step 5: Update WordPress Core
To update WordPress core files, use the following command:
wp core update
This command will update your WordPress core files to the latest version available.
Step 6: Verify the Update
After updating WordPress core files, verify the update by visiting your website and checking if everything is working correctly.
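You can also confirm the installed version from the command line:
wp core version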
Conclusion
Updating WordPress core files is essential to keep your site secure and up to date. By using WP-CLI, you can update your WordPress core files with ease and efficiency. With this step-by-step guide, you can confidently update your WordPress core files using WP-CLI and keep your website running smoothly and securely.
As a beginner using WordPress, keeping your website secure and stable is crucial. One of the best ways to do this is by regularly updating your plugins to their latest versions. However, manually updating each plugin one by one can be time-consuming and tedious, especially if you have a lot of plugins installed. This is where WP-CLI’s “wp plugin update --all” command comes in handy.
In this article, we’ll cover everything you need to know about using WP-CLI’s “wp plugin update --all” command to keep your WordPress site secure and stable. We’ll cover the basics of WP-CLI, explain how to use the “wp plugin update --all” command, and give you some tips on best practices for keeping your plugins up to date.
What is WP-CLI?
WP-CLI is a command-line interface for WordPress that allows you to perform a wide range of tasks, such as installing WordPress, managing plugins and themes, and creating posts and pages, all without using the WordPress admin interface. WP-CLI can save you a lot of time and effort by automating many common WordPress tasks and making it easier to manage your website.
How to Use the “wp plugin update --all” Command
To use the “wp plugin update --all” command, you’ll need to have WP-CLI installed on your server. If you haven’t already done so, you can follow the instructions on the WP-CLI website to install it.
Once you have WP-CLI installed, you can navigate to the root directory of your WordPress installation and run the “wp plugin update --all” command. WP-CLI will then check for updates for all of the plugins installed on your site and update them to their latest versions.
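For example, assuming your WordPress install lives in /var/www/html:
cd /var/www/html
wp plugin update --all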
Best Practices for Updating Your Plugins
While using the “wp plugin update --all” command is a great way to keep your plugins up to date, there are some best practices you should follow to ensure that your website remains secure and stable:
Create a backup of your website before updating your plugins. This will give you a safety net in case anything goes wrong during the update process.
Check for plugin compatibility issues before updating. Some plugins may not be compatible with the latest version of WordPress or with other plugins on your site. Always check for compatibility issues before updating to avoid any potential problems.
Update your plugins regularly. Plugin developers release updates for a reason, often to fix security vulnerabilities or improve performance. By updating your plugins regularly, you’ll ensure that your site is secure and running smoothly.
Test your site after updating your plugins. After updating your plugins, it’s a good idea to test your site to make sure everything is working as expected. This will help you catch any issues early and avoid any potential problems for your visitors.
By following these best practices, you can ensure that your WordPress site remains secure and stable, and that your visitors have a great experience when visiting your site.
Conclusion
WP-CLI’s “wp plugin update --all” command is a powerful tool for keeping your WordPress site secure and stable. By regularly updating your plugins, you can ensure that your site is running smoothly and free from known security vulnerabilities. With the tips and best practices we’ve covered in this article, you’ll be able to use the “wp plugin update --all” command with confidence and keep your WordPress site in top shape.
Losing your website can be a nightmare for any website owner. Unfortunately, I experienced this firsthand when I accidentally executed the rm -rf command on my public_html folder while working on my WordPress site. In a split second, my website disappeared into thin air, leaving me with a sinking feeling in my stomach.
After trying every trick in the book to recover my website, I realized that I had lost everything, including my precious database. That is when I decided to set up a backup system to prevent such disasters from happening again.
I researched various backup solutions but decided to write a PHP script to backup my MySQL database periodically. Here is why I chose this approach:
Easy to set up: The PHP script I wrote was straightforward to set up, even for a non-technical person like me. All I had to do was edit the configuration file with my database credentials and set up a cron job to run the script at regular intervals.
Efficient: The script only backs up the necessary tables in my database, which saves me a lot of time and disk space. I don’t have to worry about backing up unnecessary data, making the process much more efficient.
Cost-effective: Writing my own PHP script was free, which meant I didn’t have to spend any money on a backup solution. This was particularly important for me as a small business owner who wanted to minimize expenses.
Peace of mind: Knowing that I have a backup of my database gives me peace of mind. I no longer have to worry about losing my website or data, which allows me to focus on growing my business.
Losing my website was a painful experience, but it taught me a valuable lesson about the importance of backups. By writing a simple PHP script to periodically backup my MySQL database, I can now rest easy knowing that my website and data are secure. If you haven’t already done so, I highly recommend that you set up a backup system for your website. It may just save you from a nightmare like mine.
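Here is a minimal sketch of such a script, assuming mysqldump is available on the server; the credentials and paths are placeholders you should replace with your own:

<?php
// backup.php - a minimal sketch; replace the placeholder credentials with your own.
$dbHost = 'localhost';
$dbUser = 'your_db_user';
$dbPass = 'your_db_password';
$dbName = 'your_db_name';

// Timestamped file name so older backups are not overwritten.
$backupFile = __DIR__ . '/backup-' . $dbName . '-' . date('Y-m-d-His') . '.sql.gz';

// Dump the database with mysqldump and compress it with gzip.
// (You can list specific table names after the database name
// to back up only the tables you need.)
$command = sprintf(
    'mysqldump --host=%s --user=%s --password=%s %s | gzip > %s',
    escapeshellarg($dbHost),
    escapeshellarg($dbUser),
    escapeshellarg($dbPass),
    escapeshellarg($dbName),
    escapeshellarg($backupFile)
);

exec($command, $output, $exitCode);

echo $exitCode === 0
    ? "Backup created: $backupFile\n"
    : "Backup failed with exit code $exitCode\n";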
To use this script, replace the database credentials with your own and save it as a PHP file (e.g., backup.php) on your server. Then, set up a cron job to run the script at your desired intervals (e.g., once a day or once a week).
Note that this script only backs up the database and not the files on your server. You may want to consider backing up your files as well, either manually or through a separate backup solution.
How do you schedule it to run at 4 AM every morning?
Here’s an example cron job that will run the PHP script every day at 4 AM Pakistan Standard Time:
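0 4 * * * /usr/bin/php /path/to/backup.php >/dev/null 2>&1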
In this example above, 0 4 * * * specifies the time and date the cron job will run, which translates to “run at 4:00 AM every day”. /usr/bin/php is the path to the PHP binary, and /path/to/backup.php is the path to your PHP backup script. The >/dev/null 2>&1 part is used to redirect the script’s output to /dev/null, which discards it, so you don’t receive any emails with output from the cron job.
Make sure to replace /path/to/backup.php with the actual path to your PHP script. You can add this cron job by editing your crontab file using the crontab -e command.
Crontab is a command that allows you to schedule commands or scripts to run automatically at specific times or intervals. It’s a very useful tool for automating repetitive tasks, such as running backups, updating files, or sending emails.
Here’s how to use crontab:
Open a terminal window on your server.
Type this command to edit your crontab file: crontab -e. This will open your crontab file in the default text editor.
In the crontab file, add a new line for the command or script you want to run. The format for the line is: * * * * * command. The five asterisks represent the minute, hour, day of the month, month, and day of the week when the command will run. You can use numbers or special characters (e.g., * for all values) to specify when the command should run.
After adding your command to the crontab file, save and exit the text editor.
Your new crontab entry is now active and will run automatically at the specified times.
Note that crontab uses the system time zone, so make sure to specify the correct time zone in your commands or scripts if necessary.
Overall, crontab is a powerful tool that can help you automate many tasks on your server. By using crontab, you can save time and increase the efficiency of your workflows.
Ikigai is a book that delves into the Japanese concept of finding one’s purpose in life. The author, Héctor García, does an excellent job of explaining the concept and how it can be applied to one’s own life. The book is filled with stories and examples from Japanese culture and from the author’s own experience, which really helps to bring the concept to life.
One of the things that I appreciated about the book is the way the author breaks down the concept of Ikigai into four key elements: passion, mission, vocation, and profession. He explains that in order to find your Ikigai, you must find something that you love, something that the world needs, something that you can get paid for, and something that you’re good at. The book then goes on to explore each of these elements in more detail.
I also liked how the author emphasizes the importance of living in the present and taking small steps toward achieving your Ikigai. He stresses the importance of taking things one day at a time and not getting bogged down by the overwhelming task of finding your purpose in life. He also encourages readers to be open to new experiences and not be afraid to try something new.
The book is filled with practical tips and exercises that readers can use to help them find their Ikigai. I found these to be particularly helpful, as they provided a tangible way to apply the concepts from the book to my own life.
One of the things that I found most valuable about the book is the way it encourages readers to think about their own lives in a new way. It challenges readers to question the status quo and to think about what truly brings them happiness and fulfillment. It encourages readers to think about their passions, their values, and their strengths, and to consider how these can be incorporated into their lives in a meaningful way.
Overall, I highly recommend Ikigai to anyone who is searching for a sense of purpose and meaning in their life. The book is well-written, easy to read, and filled with valuable insights and practical tips. It’s a valuable resource for anyone who is looking to find their Ikigai and live a more fulfilling life.
Marie Kondo and Ikigai. Is there a correlation between KonMari and Ikigai?
Marie Kondo and Ikigai share some similarities in that they both encourage individuals to focus on what brings them joy and fulfillment in their lives. The KonMari method, developed by Marie Kondo, focuses on decluttering and organizing one’s physical space in order to create a more peaceful and fulfilling environment. The idea is that by getting rid of items that no longer spark joy, individuals can free themselves from unnecessary distractions and focus on the things that truly matter to them.
Ikigai, on the other hand, is a Japanese concept that centers around finding one’s purpose and meaning in life. It encourages individuals to think about their passions, values, and strengths, and to consider how these can be incorporated into their lives in a meaningful way.
Both Marie Kondo and Ikigai share a focus on finding what brings joy and fulfillment in life. The KonMari method declutters the physical space, which in a way declutters the mind as well, making it easier for an individual to focus on what brings them true joy and fulfillment. Ikigai, in turn, helps in finding purpose and meaning in life; the joy and fulfillment come from knowing what one’s purpose is.
While the two concepts are distinct, they can complement each other in helping individuals to create a more harmonious and fulfilling life. By decluttering their physical space and getting rid of things that no longer serve them, individuals may find it easier to focus on what truly matters to them and pursue their Ikigai.
Are you looking to take your online courses and e-learning to the next level? Look no further than our LearnDash services.
We are experts in creating custom LearnDash websites that are tailored to meet the unique needs of your business. Whether you’re a small business looking to create an online course or a large enterprise looking to scale your e-learning, we’ve got you covered.
Custom Theme Development:
We understand that your brand is unique, and your online learning platform should reflect that. That’s why we offer custom theme development to ensure that your LearnDash website aligns with your brand and provides an engaging user experience.
Our team of experts stays up-to-date with the latest design trends and techniques to guarantee that your website is visually pleasing and easy to navigate.
Payment Gateway Integration:
We know that making a purchase should be seamless for your users. That’s why we offer integration with various payment gateways like PayPal, Stripe, and Authorize.net, to make the process of purchasing your content as smooth as possible.
SEO Optimization:
SEO is crucial for any online business, and we make sure that your LearnDash website is fully optimized for visibility and search engine ranking.
Custom Functionality:
We provide custom functionality like content dripping, user analytics, and custom reporting, to give you a better understanding of your users and improve their experience.
Ongoing Support and Maintenance:
We understand that the needs of your business may change over time. That’s why we offer ongoing support and maintenance services to keep your LearnDash website up-to-date and in line with your business needs.
OK, but why should you hire us?
Our team of experts is passionate about providing you with the best possible service, and we’re dedicated to making sure that your LearnDash website is a success.
Don’t miss out on the opportunity to elevate your online learning experience. Contact us today to learn more about our custom LearnDash services and take the first step towards an exceptional online learning platform.
To succeed as a web developer in 2023, it is important to stay up-to-date with the latest technologies and trends. Here are a few key areas to focus on:
Learn modern web development frameworks such as React, Angular, and Vue. These frameworks are widely used in the industry and will give you a solid foundation for building web applications.
Get familiar with JavaScript and its ecosystem. JavaScript is the most popular programming language for web development, and it’s essential to know it well to be successful.
Learn about web performance and optimization. With the increasing use of mobile devices, it’s crucial to make sure your web applications load quickly and efficiently.
Understand the basics of cloud computing. Cloud platforms such as AWS, Azure, and Google Cloud are widely used in the industry, and knowledge of these platforms will give you an edge.
Continuously improve your skills and stay up-to-date with the latest web development trends. Web development is a rapidly changing field, so it’s important to continuously learn new technologies and best practices.
Networking and collaboration with other web developers and companies will be key to success in the field.
Building a portfolio of your work and showcasing it online will help you get hired by companies.
Overall, to succeed as a web developer in 2023, it’s important to have a solid understanding of the latest technologies and trends and to continuously improve your skills through learning and practice.
Deep Work: Rules for Focused Success in a Distracted World, by Cal Newport, is a book that explores the importance of focus and concentration in today’s digital age. The author argues that the ability to focus on a task without distractions is becoming increasingly valuable in today’s fast-paced and hyper-connected world.
Newport starts by defining “deep work” as “professional activities performed in a state of distraction-free concentration that pushes your cognitive capabilities to their limit.” He argues that deep work is essential for producing high-quality work and achieving success in today’s economy.
The book is divided into two parts: “The Idea” and “The Rules.” In the first part, Newport explains the concept of deep work and its importance in today’s world. He cites studies and examples of successful individuals, such as Bill Gates and J.K. Rowling, who have used deep work to achieve their goals. Newport also addresses the negative effects of constant distractions and multitasking on our ability to focus and perform deep work.
In the second part of the book, Newport provides “rules” for incorporating deep work into our daily lives. He suggests setting specific goals for deep work, scheduling time for deep work, and eliminating distractions. Newport also recommends using “time blocking” to schedule time for deep work and “time dilation” to make time for deep work by focusing on the most critical tasks.
One of the key takeaways from the book is the idea that deep work is not only important for individuals but also organizations. Newport argues that companies that prioritize deep work will have a competitive advantage in today’s economy.
One of the most valuable aspects of the book is how Newport provides practical advice and strategies for incorporating deep work into our daily lives. He acknowledges that it can be difficult to focus in today’s digital age and provides specific tips and tricks to help readers achieve a state of deep work.
In conclusion, Deep Work is a valuable read for anyone looking to improve their productivity and achieve success in today’s economy. The author presents a compelling argument for the importance of focus and concentration and provides practical advice for incorporating deep work into our daily lives. The book is well-researched, thought-provoking, and a must-read for anyone looking to improve their productivity and achieve their goals.
In the end, the book is a great reminder that deep work is the key to success in the knowledge economy and that it is more important than ever to make time for it. It is well-written, easy to read, and offers advice that can be put into practice immediately.
My thoughts about the Audible version
There are several reasons why someone might prefer listening to the Deep Work audiobook over reading it from the paper:
Convenience: Listening to an audiobook allows for multitasking and allows the listener to listen to the book while doing other activities such as driving, exercising, or doing household chores.
Accessibility: An audiobook can be a great option for people who have difficulties reading from paper, such as those with visual impairments or dyslexia.
Variety in narration: Listening to an audiobook allows the listener to experience the book in a different way, with the added dimension of the narrator’s performance. A well-narrated audiobook can bring the author’s words to life in a way that reading from paper cannot.
Professional narration: A professionally narrated audiobook can offer a high-quality listening experience, with clear and engaging narration.
Commuting: For people who have a long commute, an audiobook can be a great way to make the most of that time.
Overall, the Deep Work audiobook, narrated by Jeff Bottoms, offers a convenient and engaging way to experience the book and can be a great option for those who want to access the book while multitasking, have difficulty reading from paper, or want to hear a professional narration.
As a software engineering team with over two decades of experience building customer-centric applications, we understand the importance of creating a personalized and seamless user experience. That’s why we offer MemberPress membership site development services to help you take your online business to the next level.
Our team of WordPress and MemberPress experts has extensive experience in creating custom MemberPress membership sites. We understand that every business is different; that’s why we work closely with you to understand your specific requirements and design a membership site tailored to your business needs.
Our services include (but are not limited to):
Custom Theme Development: We can design and develop custom themes for your MemberPress membership site that align with your brand and provide an engaging user experience. We use the latest design trends and techniques to ensure that your site is visually appealing and easy to navigate.
Payment Gateway Integration: We can integrate your MemberPress membership site with various payment gateways such as PayPal, Stripe, and Authorize.net, and we can also build custom payment gateway integrations to ensure that your users can easily purchase and access your content.
Custom Functionality: We can add custom functionality to your MemberPress membership site such as content dripping, user analytics, and custom reporting to help you better understand your users and improve their experience.
Bot integration: We can integrate your MemberPress membership site with various chatbot platforms like Dialogflow, ManyChat, etc. to automate the customer service process and improve the user experience.
Security and Performance Optimization: We understand the importance of security and performance for your online business. That’s why we ensure that your MemberPress membership site is secure and optimized for maximum performance.
Ongoing Support and Maintenance: We understand that your business needs may change over time, that’s why we offer ongoing support and maintenance services to ensure that your MemberPress membership site stays up to date and continues to meet your business needs.
Please get in touch via our email [email protected] referencing this page and you’ll get a special discount.
OK, for years I have been pretending that I read books, while I have never read more than 20 books a year. At least that’s what my Goodreads stats say. But I have planned to change that for good now.
Saturday, April 13: Read 11 pages. The chapter is interesting. It talks of a dystopian future and lays out the scene for the story that is going to be told. I can’t believe I bought this book in 2014 and then never read it. The protagonist is a bounty hunter in a distant future where the earth is plagued by radioactivity and technology has come to a point where electric sheep look exactly like the real ones. But people now want the real ones for some reason. I see; the point, maybe, is that people just want what’s rare and different. It sounds a lot like a chapter from a coursebook that we read back in the day. It was in Urdu and the title was “Man is never happy”. I believe this is going to be one of those stories. I don’t know. Let’s see how it goes.
I found “Do Androids Dream of Electric Sheep” to be a thought-provoking and intellectually stimulating read. Philip K. Dick presents a dystopian vision of a future where technology has surpassed humanity and raises important ethical questions about the nature of consciousness and artificial intelligence.
The novel’s portrayal of advanced androids, almost indistinguishable from real humans yet lacking empathy, is a cautionary tale about the dangers of creating artificial intelligence that mimics human behaviour without possessing a true consciousness. This serves as a reminder of the importance of considering the moral implications of our technological advancements.
Additionally, the theme of authenticity, represented by the characters’ longing for real animals amidst a world of artificiality, highlights the human desire for genuine and unique experiences. It serves as a commentary on the human condition and our tendency to value what is rare and authentic.
The author’s unique, stream-of-consciousness narrative style adds to the overall atmosphere of the novel and provides a complex and multi-layered reading experience. The story’s protagonist, Rick Deckard, is a morally ambiguous character whose inner turmoil reflects the larger existential questions raised by the novel.
Overall, “Do Androids Dream of Electric Sheep” is a thought-provoking exploration of the consequences of creating artificial intelligence that can mimic human behaviour. It is a novel that will make the reader question the nature of humanity and the ethics of technology. As someone with a background in both computer science and humanities, I found it to be a valuable addition to my literary repertoire and would highly recommend it to others with similar interests.
In your wp-config.php, add the following line, preferably near the top:
define( 'FS_METHOD', 'direct' );
Also, check that your write permissions are good; you can inspect and fix file permissions via your FTP client.
Also, you can see which user on your server is the owner of the WordPress installation directory. You can find out via the following PHP code in any file.
<?php echo exec( 'whoami' ); ?>
Let’s say the directory is not owned by your current user; you can change the owner to your web server’s username using your Linux terminal and the chown command.
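For example, assuming your web server runs as www-data and WordPress lives in /var/www/html (adjust both to your setup):
sudo chown -R www-data:www-data /var/www/html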
One reader shared another fix for the “Updating failed. Error message: The response is not a valid JSON response.” error:
I noticed that this error comes when you have more than 50 revisions of the same post/page.
I just cleared all the revisions using the WP-Optimize plugin and the error is gone. After that, I put a limit on post revisions to prevent the error, and it’s working perfectly fine.
Since the recent update to macOS Catalina and the forced switch to zsh, I was having problems with a lot of my CLI tools, including Flutter. The reason: zsh uses ~/.zshrc and not ~/.bash_profile.
So to solve the problem, simply add your flutter path to ~/.zshrc like this
export PATH="$HOME/your_path/flutter/bin:$PATH"
Change your_path to where you downloaded and extracted Flutter on the earlier version of macOS.
So I recently upgraded to macOS Catalina and all hell broke loose. Many of my command line utilities stopped working, and I have definitely wasted close to 30 hours trying to fix everything to make it work just like it did in the previous macOS version.
Recently, I started working on a demo for my talk at WordCamp Islamabad and wanted to try the latest tools for building a React Native app. So naturally, I wanted to try the Expo CLI, as it gives all the tools needed to work with React Native on a number of platforms, i.e., web, iOS, and Android.
I installed the expo CLI using the following command:
npm install -g expo-cli
But when I tried to run expo, it gave me this error
zsh: command not found: expo
Having rarely worked with zsh and not having much command over it, I tried many ways to fix it, including adding the path of the npm root, which I got via:
npm root
Apparently, it’s not a reliable way to do it. So I ended up adding
export PATH="$HOME/.npm-packages/bin:$PATH
to my ~/.zshrc file. Which solved the problem for good.
The “timeit” module lets you measure the execution time of small bits of Python code. This can help you find the execution time of your code and thus help in a quick performance improvement of your code. A tiny example follows.
>>> import timeit
>>> timeit.timeit('"-".join(str(n) for n in range(100))', number=10000)
0.2938678440004878
>>> timeit.timeit('"-".join([str(n) for n in range(100)])', number=10000)
0.26015590599854477
>>> timeit.timeit('"-".join(map(str, range(100)))', number=10000)
0.26461737899808213
Note that the execution time varies from run to run, even for the same snippet, as in the first two cases. The third one does the same thing but has yet another execution time. This kind of profiling helps you ship performant code to production.
Also, this different execution time for the same exact code depends on a lot of factors, the major one being how busy your CPU was at the time of executing this code. The module function timeit.timeit(stmt, setup, timer, number) accepts four arguments:
stmt which is the statement you want to measure; it defaults to ‘pass’.
setup which is the code that you run before running the stmt; it defaults to ‘pass’. We generally use this to import the required modules for our code.
timer which is a timeit.Timer object; it usually has a sensible default value so you don’t have to worry about it.
number which is the number of executions you’d like to run the stmt.
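For instance, here is a quick sketch that uses the setup argument to import a module before the statement is timed:

import timeit

# setup runs once; stmt is then executed `number` times
elapsed = timeit.timeit(
    stmt='sqrt(12345)',
    setup='from math import sqrt',
    number=100000,
)
print(elapsed)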
This post is one of my new series solving one problem per day.
For those of you solving coding challenges for your next software engineering job, here is one more problem.
This problem was asked by Uber.
Problem Statement
Given an array of integers, return a new array such that each element at index i of the new array is the product of all the numbers in the original array except the one at i.
For example, if our input was [1, 2, 3, 4, 5], the expected output would be [120, 60, 40, 30, 24]. If our input was [3, 2, 1], the expected output would be [2, 3, 6].
Here is a solution I came up with
def product_ar(arr):
    new_arr = []
    for i in range(0, len(arr)):
        new_arr.append(multiply_all(arr[0:i], arr[i+1:len(arr)]))
    return new_arr

def multiply_all(arr1, arr2):
    product = 1
    for item in arr1:
        product *= item
    for item in arr2:
        product *= item
    return product

# Some tests
print(product_ar([1, 2, 3, 4, 5]) == [120, 60, 40, 30, 24])
print(product_ar([3, 2, 1]) == [2, 3, 6])
# Different ways to test multiple flags at once in Python
x, y, z = 0, 1, 0

if x == 1 or y == 1 or z == 1:
    print('passed')

if 1 in (x, y, z):
    print('passed')

# These only test for truthiness:
if x or y or z:
    print('passed')

if any((x, y, z)):
    print('passed')
So I was writing an A/B test and was looking for the default jQuery method for loading an external script. Using jQuery.getScript, you can do exactly that: load a script after a web page has already loaded and perform actions once it is loaded into the DOM.
See the Pen “jQuery.getScript demo” by Fahad Murtaza (@fahdi) on CodePen.
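For reference, here is a minimal sketch of the pattern; the URL and the runExperiment callback are placeholders for your own script and code:

// load an external script, then act once it has been fetched and executed
jQuery.getScript('https://example.com/ab-test.js')
  .done(function (script, textStatus) {
    runExperiment(); // hypothetical function defined by the loaded script
  })
  .fail(function (jqxhr, settings, exception) {
    console.error('Failed to load script', exception);
  });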
In cases when you have access to code and not to an existing admin account, here is a way to create an admin account for yourself without waiting for someone else to create it for you.
In your `functions.php` file, use the following code:
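Something along these lines works; the username, password, and email are placeholders, and you should remove the code once you have logged in:

function my_add_admin_account() {
    $username = 'yourusername';
    $password = 'yourpassword';
    $email    = 'you@example.com';
    // only create the account if it does not exist yet
    if ( ! username_exists( $username ) && ! email_exists( $email ) ) {
        $user_id = wp_create_user( $username, $password, $email );
        $user    = new WP_User( $user_id );
        $user->set_role( 'administrator' );
    }
}
add_action( 'init', 'my_add_admin_account' );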
Save the file and access your WordPress admin area. Use the settings from the code above, i.e. the same username and password, and you’ll be able to access WordPress as admin. That’s all!
Remember:
With great power comes great responsibility. ~ Not spiderman
So if you are a developer, you might need to clean up comments from code that you copied from somewhere else, or updating the documentation requires you to start from scratch, or you just want to get rid of comments. After all, it can’t be called code if it can be understood. Just kidding!
For JavaScript, C/C++, or PHP comments, or basically any language that uses this syntax for comments:
/* Your comment goes here*/
or
/**
* Your comments here
* Your comments here again
*/
the search query would be like:
/\*(.|[\r\n])*?\*/
Just make sure to enable the regex option, as I did in the screenshot below; search with the pattern above and replace with nothing (an empty string).
Regex checkbox in PhpStorm IDE
For HTML comments, you’d use something along these lines for the regex pattern for your search and replace query
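<!--(.|[\r\n])*?-->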
So imagine your MySQL is on a shared host and you have timestamps that are automatically inserted into the table. In this case, updating MySQL’s time zone settings is beyond your control. Even if you can’t set the time zones, for records that need to be in a specific time zone, you can add hours or minutes to the timestamp you get from the database in your PHP code.
Assuming your record object is $record and the timestamp field is ‘timestamp’, the following code will allow you to add 5 hours to your timestamp and print in the d/m/Y format which you can update in the code below.
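A minimal sketch:

// assuming $record->timestamp holds a MySQL DATETIME string
$timestamp = strtotime( $record->timestamp );

// add 5 hours (5 * 3600 seconds) and print in d/m/Y format
echo date( 'd/m/Y', $timestamp + 5 * 3600 );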
So in a recent project, the hardware being used always sent data prepended with zeros. I was using this value as a number in my database, so here is what I did to remove the leading zeroes from the string.
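A sketch of the idea; the sample value is illustrative:

// e.g. the hardware sends the reading as "000042"
$raw = '000042';

// ltrim strips the leading zeroes; casting to int works too
$value = ltrim( $raw, '0' );
echo $value; // prints 42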
Sometimes you want to have a subdirectory on the master branch be the root directory of a repository’s gh-pages branch. This is useful for things like sites developed with Yeoman, or if you have a Jekyll site contained in the master branch alongside the rest of your code. Also, a lot of static site generators create a build folder called ‘dist’ or ‘static’. I was recently using the Foundation CLI on this website and came across this problem when I wanted to quickly show all the changes to the client I was building the form for. So I used the following process.
For the sake of this example, let’s pretend the subfolder containing your site is named dist.
Step 1
Remove the dist directory from the project’s .gitignore file (it’s ignored by default by Yeoman).
Step 2
Make sure git knows about your subtree (the subfolder with your site).
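A sketch of the commands, assuming the dist folder has been committed on master:

git add dist
git commit -m "Add dist folder"
git subtree push --prefix dist origin gh-pages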
Sometimes, all you need is a quick command you can run to land in a directory you access most often. In this case, on one of my test servers, I wanted to go to the main directory where I keep my websites. I just created an alias called ‘home’ and now I use this to go into my directory.
alias home='cd /var/www/'
No memorization needed anymore and no time wasted finding where I kept everything.
You can do a mongodump of a collection from one database and then mongorestore the collection to the other database.
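For example (database and collection names here are illustrative):

# dump one collection from the source database
mongodump --db source_db --collection users --out ./dump

# restore that collection into the target database
mongorestore --db target_db --collection users ./dump/source_db/users.bson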
I used Robomongo, and in its documents view, with this query and the right names for the collections and databases, I was able to copy around 3,000 records in less than a second on my old MacBook Air.
So you just committed some code only to realize you had a few files missing from your commit, which you forgot to add. This happens a lot in everyday coding, and git has an easy fix for it. If you are like me, you have done something like resetting HEAD to the commit before the last one and then committing the files again, but that’s a hack and it’s avoidable. How? Keep reading.
Use git add -u or git add . to stage the new changes or add any missing files.
And then for updating the last commit with newly added / staged files use --amend argument.
git commit --amend
If you just want to update the commit message, use -m option for adding a commit message. If you don’t specify it, you will be prompted with the previous commit message as a default, in a standard vim mode.
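For example:
git commit --amend -m "Your updated commit message"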
For someone new to vim: press the Escape key to leave editing mode, then type :wq to save the message and complete the commit.
When you’re done you can check git log --stat to see your amended commit with the extra changes.
Just released a new version of my GitHub Pages site, which you can find at fahdi.github.io. GitHub Pages is an interesting feature of GitHub for publishing a personal portfolio, a project page, or any static site (or dynamic: look at PakistanJS). You push your code as static HTML, CSS, and JavaScript files; in this case, I have used Jekyll to have some configuration in place and plain jQuery to pull in my GitHub repos dynamically.
The Teamwork API is great, but while working on it, I badly felt the need to test different endpoints before putting them into my app. Naturally, I tried looking for a collection on the web and didn’t find anything, so I finally created my own, and here it is for you to use.
Please open a PR if you improve anything. The collection uses a global variable, i.e. {{TeamworkHost}}, which can be set in a user-defined Postman environment to your own team URL, as in the following screenshot.
If you’re using the Angular 2 beta version and want to upgrade to the Angular 2 final release, it’s a good idea to be aware of the breaking changes from beta to final. I followed a few YouTube tutorials and struggled with it, so here is what I learnt:
Most of the content that was posted earlier on internet forums and YouTube tutorials deals with Angular 2 beta. I waited for quite a while to move to Angular 2 so that it would mature before I adopted it. But even then, the relevant study material is old and you can’t really follow it word for word.
Module Names
Angular packages have been renamed from angular2 to @angular.
import { Component } from '@angular/core';
Components and Directives
In Beta, you had to register a component or directive in the directives attribute of the host component:
@Component({
  directives: [FirstComponent, SecondComponent]
})
export class MyCoolComponent { }
In the Final Release, directives is removed from component metadata.
You should register all components and directives in AppModule (The main app’s root module) or the module they belong to (on sub level). Like the following:
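A minimal sketch (the component names and file paths are illustrative):

import { NgModule } from '@angular/core';
import { BrowserModule } from '@angular/platform-browser';
import { AppComponent } from './app.component';
import { FirstComponent } from './first.component';
import { SecondComponent } from './second.component';

@NgModule({
  imports: [ BrowserModule ],
  declarations: [ AppComponent, FirstComponent, SecondComponent ],
  bootstrap: [ AppComponent ]
})
export class AppModule { }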
With Angular’s final release, the syntax for the *ngFor loop has changed: you should now be using the let keyword. Here is how to upgrade all your for loops to the new syntax.
In Beta:
*ngFor="#course of courses"
In the final version:
*ngFor="let course of courses"
Note: I’ll keep this post updated if I find anything else as I continue with my study.
So I am learning Angular 2 and decided to use Angular CLI this time instead of those seed projects.
BTW, I am following this course to study basics of Angular.
sudo ng new angular-app
And this is what happened
Password:
As a forewarning, we are moving the CLI npm package to "@angular/cli" with the next release,
which will only support Node 6.9 and greater. This package will be officially deprecated
shortly after.
To disable this warning use "ng set --global warnings.packageDeprecation=false".
installing ng2
create .editorconfig
create README.md
create src/app/app.component.css
create src/app/app.component.html
...
...
...
create karma.conf.js
create package.json
create protractor.conf.js
create tslint.json
Directory is already under version control. Skipping initialization of git.
Installing packages for tooling via npm.
It was stuck at the
Installing packages for tooling via npm
for ages and since I didn’t want to give up using the CLI, here is how I solved it.
$ ng new angular-app --skip-npm
As it was stuck on the npm package install process, and using --verbose with the command didn’t help, I used the above command to skip installing the npm packages and did it manually.
So after above, I used
$ cd angular-app
to get into the directory and then installed npm via
$ npm install
that install script takes care of everything else that the CLI does when it all works normally.
Now, since we are using the CLI, here is how to compile and run it
$ ng serve
and it should show something like the following when it all works nicely.
Over the last 10 years, I have learnt that sharing code on websites can be tricky. Just yesterday I was working on a blog post and realized that anything between <code></code> tags can look ugly if not styled properly.
For example, after writing some bash code within the post, it looked quite ugly:
Now, I just investigated what my CSS looked like. The CSS styling it was barebones, like this:
code,
kbd,
tt,
var {
font: 15px Monaco, Consolas, "Andale Mono", "DejaVu Sans Mono", monospace;
}
Clearly not acceptable: it looked plain and ugly. I didn’t want to use a third-party JS-based highlighter and wanted a clean CSS approach. I don’t need complex syntax highlighting and basically didn’t want to mess with the simplicity and cleanliness of my site. In short, no time for bullshit. So here is what I did:
code,
kbd,
tt,
var {
font: 15px Monaco, Consolas, "Andale Mono", "DejaVu Sans Mono", monospace;
background: #eee;
color: black;
margin: 20px 0;
display: block;
padding: 20px;
}
Now, that’s more like it and it would simply add some nice greyish background, fix some margins and padding around the code and in a way, highlight it so it’s well separated from the general text. Problem solved! Nice and simple.
My cache wouldn’t clear automatically, so I simply cleaned the browser cache; I had already tested in incognito mode, the best way to check websites while you are developing them, when you just want quick results instead of cleaning the browser cache. Here is the final result:
This works perfectly on the white background of my site. I am a happy man and all it took was 20 minutes to do that and also write this blog post about it. A quick lesson in usability and user experience 🙂
Recently, I had a friend set up a CentOS server for me, as I personally have experience setting up and managing LAMP on Ubuntu servers but needed a CentOS expert. Anyhow, the way to manage Apache on CentOS is a bit different from how it’s done on Ubuntu or Debian-based Linux. So I was trying to open my site and it wouldn’t connect. After SSH-ing into my box, I realized Apache was not running and I needed to restart it. Usually, I’d just run
httpd restart
but it doesn’t really work, so after a couple minutes of googling, I found it
apachectl start
And, here is the error I got
Syntax error on line 295 of /etc/httpd/conf/httpd.conf:
DocumentRoot '/var/www/html' is not a directory, or is not readable
Now the issue was, I had deleted the directory without checking whether Apache was using it, as I usually remove the default Apache config that uses the `html` directory for the default site. I know, a noob mistake. Anyhow, I created that directory again, and this time
apachectl start
worked like a charm. But I didn’t give up on the httpd command just yet, so I tried it again:
# httpd restart
Usage: httpd [-D name] [-d directory] [-f file]
[-C "directive"] [-c "directive"]
[-k start|restart|graceful|graceful-stop|stop]
[-v] [-V] [-h] [-l] [-L] [-t] [-S]
Options:
-D name : define a name for use in directives
-d directory : specify an alternate initial ServerRoot
-f file : specify an alternate ServerConfigFile
-C "directive" : process directive before reading config files
-c "directive" : process directive after reading config files
-e level : show startup errors of level (see LogLevel)
-E file : log startup errors to file
-v : show version number
-V : show compile settings
-h : list available command line options (this page)
-l : list compiled in modules
-L : list available configuration directives
-t -D DUMP_VHOSTS : show parsed settings (currently only vhost settings)
-S : a synonym for -t -D DUMP_VHOSTS
-t -D DUMP_MODULES : show all loaded modules
-M : a synonym for -t -D DUMP_MODULES
-t : run syntax check for config files
This time I realized I had been running it without the extra params I had to give to the httpd command, so here we go:
[-k start|restart|graceful|graceful-stop|stop]
Now I just wanted to try it
# httpd -k stop
# httpd -k restart
So the first one stops it; I checked afterwards and the site wouldn’t load, which means it worked. Then I simply used restart (I could have used -k start to start it as well) to get it up and running.
This time, the above command worked perfectly. So a little bit of learning today. Moral of the story: just try a little harder before googling it.
While I was doing some debugging, I badly needed to compare two JavaScript objects that I could easily print to the console, but I was having a hard time remembering all of the differences from one object to the other. This was a major API change and I couldn’t just keep everything in my memory. So I needed a quick way of copying the JS objects to the clipboard. With Chrome dev tools, I printed both objects to the console with console.log and then used a Sublime comparison tool to compare them and easily note the differences whenever I needed to.
Here is how to do that:
Right click on the variable or object and then click on “Store as Global Variable”. Since I had two variables that I needed to compare. I saved them both in the same manner.
Chrome Dev Tools – Save console log object into global variable
Chrome Dev Tools – Objects after being saved as Global Variables
Now for the final step, I used copy(temp1) and copy(temp2) to get the variables into the clipboard and then copied them into different files to compare. You can use any file comparison tool for the visual diff, which helps with quickly noticing what changed and making whatever code changes are needed for the new object in the application. In my case, I used the ‘Compare Side-By-Side’ package for Sublime Text. As you can see, I was quickly able to see the differences between the properties of the objects as we changed the API from V1 to V2.
Side-by-side file comparison & difference tool for ST2/3
My mom always says, just do it! Or was it Nike? Anyhow, I plan on actually just doing it. Like this guy!
So I am going to challenge myself every day. It’s 6:35 in the morning and I have been up all night. I just want to reinvent myself in the best possible way. Maybe I just need some new learning. My day job involves Angular 1, so I am kinda bored and want to learn new skills. So here we go:
I am starting with installing and getting a basic “Hello World” app with React. The idea is to spend 30 minutes every day and share my experiences with you.
All you need to know for this is a little bit about Node and what npm is. That’s all. Then simply go into the terminal and type as follows:
npm install -g create-react-app
create-react-app hello-world
cd hello-world
npm start
Now after that, you should have a working app which you can open in your browser and have fun looking at it. I’d post more tomorrow. Stay tuned.
Note: This is my personal journey with learning React, and I’ll keep posting every day.