NFS Configuration in an HA OpenShift Cluster Built on AWS

October 7th, 2020 | by Saicharan Adavelli

DevOps Engineer | hands-on with Cloud Platform Infrastructure (AWS) | Red Hat OpenShift Certified
CloudGen Systems Private Limited
USA | Canada | India


Making data consistent, accessible, and retainable: how it's done.

How are stateful workloads handled? How are distributed pods able to access the data at all times? How is accessible data made consistent? And at times of failure, how is data retained?

These are some of the questions that intrigue most of us while running an application on a highly available (HA) OpenShift cluster.

First, pod workloads are distributed across the cluster to make the application highly available. This can be done by:

Using replica sets and Anti-Affinity rules

  • Replica sets give us the pod count; increasing the replica count ensures that the specified number of pods is available in a healthy state at any point in time.
  • Pod workload is handled by scheduling these pods on different compute nodes of the cluster, also making sure that no two pods of the same application land on the same node.
  • For this purpose, a combination of nodeAffinity and podAntiAffinity rules is set so that the application pods are distributed evenly.

Second, data consistency, accessibility, and retainability are achieved by provisioning NFS storage through persistent volumes.

  • Here, NFS acts as a persistent file storage device; we can make this storage instance available to the cluster using static provisioning, which requires creating persistent volumes manually.
  • The exported NFS directory acts as the shared volume.

Below is the procedure for setting this up:

1. Creating NFS-server EC2 Instance

  • Launch an Amazon Linux 2 AMI EC2 instance (CentOS, RHEL, and Fedora are also supported).
  • Configure the instance details with the OpenShift cluster VPC, associate the related public subnet in the same zone, and proceed with the default settings.
  • Configure the security groups, allowing NFS port 2049.
  • Launch the instance.
  • Now connect to the instance over SSH (for example, with PuTTY) and proceed with the NFS-server installation.

2. Installing the NFS-server

  • Install the required packages on the server with yum:
  • yum install nfs-utils

  • Create the directory that will be shared by NFS:
  • mkdir /var/nfsshare

  • Change the permissions of the folder as follows:
  • chmod -R 755 /var/nfsshare
    chown nfsnobody:nfsnobody /var/nfsshare

  • Start the services and enable them to be started at boot time:
  • systemctl enable rpcbind
    systemctl enable nfs-server
    systemctl enable nfs-lock
    systemctl enable nfs-idmap
    systemctl start rpcbind
    systemctl start nfs-server
    systemctl start nfs-lock
    systemctl start nfs-idmap

  • Share the NFS directory over the network as follows:
  • vim /etc/exports
    /var/nfsshare 192.168.100.2(rw,sync,no_root_squash,no_all_squash) ## 192.168.100.2 is the NFS client IP; note there is no space before the option list
    ## Validate with the command below (run from a client)
    showmount -e 192.168.100.1 ## lists the exports of the NFS server at 192.168.100.1

  • Finally, restart the NFS service so the new export takes effect:
  • systemctl restart nfs-server

This completes the setup of the NFS server.
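The exports line is worth double-checking: in `/etc/exports`, a space between the client and its option list changes the meaning (`host (rw,...)` exports to the world with those options, while the host gets the defaults). Below is a small sketch of a checker for that pitfall; `check_exports` is a hypothetical helper written for this post, not part of nfs-utils:

```shell
# check_exports FILE
# Flags exports lines where a space separates the client from its
# option list -- a classic NFS misconfiguration. (Hypothetical helper.)
check_exports() {
    # matches e.g. "/var/nfsshare 192.168.100.2 (rw,sync)"
    if grep -nE '^[^#]*[^[:space:]][[:space:]]+\(' "$1"; then
        echo "WARNING: space before option list detected"
        return 1
    fi
    echo "exports look ok"
}
```

Run it as `check_exports /etc/exports` on the NFS server before restarting the service.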

3. At NFS-Client End

  • Connect to the worker nodes of the OpenShift cluster, which act as the NFS clients.
  • ## NFS Client installation inside worker nodes
    yum install nfs-utils # needed on non-CoreOS nodes

  • Create the NFS directory mount point:
  • mkdir -p /mnt/nfs/var/nfsshare

  • Mount the /var/nfsshare directory:
  • mount -t nfs 192.168.100.1:/var/nfsshare /mnt/nfs/var/nfsshare/ # 192.168.100.1 is the NFS-server IP

  • Check that it is mounted correctly:
  • df -h
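Beyond eyeballing `df -h`, the mount can be verified programmatically. The sketch below greps a mounts table (by default `/proc/mounts`) for an NFS entry at the expected path; `is_nfs_mounted` is a hypothetical helper name introduced here for illustration:

```shell
# is_nfs_mounted MOUNTPOINT [MOUNTS_FILE]
# Succeeds if MOUNTPOINT appears as an NFS mount in the mounts table
# (default /proc/mounts). Hypothetical helper, for illustration only.
is_nfs_mounted() {
    mountpoint=$1
    table=${2:-/proc/mounts}
    grep -qE "[[:space:]]$mountpoint[[:space:]]+nfs" "$table"
}

# example:
# is_nfs_mounted /mnt/nfs/var/nfsshare && echo "share is mounted"
```

A check like this is handy in node bootstrap scripts, where a silently failed mount would otherwise surface much later as an application error.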

4. Deploying Storage Class

  • Below is a sample storage class; edit the storage class name and the provisioner name as needed.
  • #vim class.yml
    apiVersion: storage.k8s.io/v1
    kind: StorageClass
    metadata:
      name: nfs-storage
      annotations:
        storageclass.kubernetes.io/is-default-class: "false"
    provisioner: example.com/nfs
    parameters:
      archiveOnDelete: "false"

  • Once the class.yml file is updated, we can create the resource using oc create:
  • # create storage class file
    oc create -f class.yml
    #check if storage class got created
    oc get storageclass

5. Creating Persistent Volume (PV)and Persistent Volume Claims (PVC)

  • The PV contains file storage details such as the NFS server IP and the NFS shared path; thus it points to the file storage device, that is, to the NFS server.
    • With a reclaim policy of "Retain", the persistent volume will not be deleted or erased, so the relationship between the pod and its storage can be re-established.

    ## create pv (pv.yml)
    kind: PersistentVolume
    apiVersion: v1
    metadata:
      name: pv-nfs
    spec:
      capacity:
        storage: 1Gi
      accessModes:
        - ReadWriteMany
      persistentVolumeReclaimPolicy: Retain
      storageClassName: nfs-storage
      nfs:
        server: 192.168.100.1 ## nfs server IP
        path: /var/nfsshare ## nfs server shared path

    ## create pvc (pvc.yml)
    kind: PersistentVolumeClaim
    apiVersion: v1
    metadata:
      name: pvc-nfs
      namespace: < project name >
    spec:
      accessModes:
        - ReadWriteMany
      resources:
        requests:
          storage: 20Mi
      volumeName: pv-nfs ## should be the same as the PV name
      storageClassName: nfs-storage

    ## create the above resources
    oc create -f pv.yml
    oc create -f pvc.yml

6. Creating Pods to use Persistent Volume Claims

  • Pods utilize this shared volume through a persistent volume claim (PVC):
  • apiVersion: apps/v1
    kind: Deployment
    metadata:
      name: deploy-nfs
      labels:
        app: deploy-nfs
    spec:
      replicas: 2
      selector:
        matchLabels:
          app: deploy-nfs
      template:
        metadata:
          labels:
            app: deploy-nfs
        spec:
          containers:
            - name: deploy-nfs1
              image: busybox
              command:
                - sleep
                - "3600"
              resources:
                requests:
                  cpu: "0.01"
              volumeMounts:
                - name: pv-nfs
                  mountPath: /mydata
          affinity:
            nodeAffinity:
              requiredDuringSchedulingIgnoredDuringExecution:
                nodeSelectorTerms:
                  - matchExpressions:
                      - key: kubernetes.io/hostname
                        operator: In
                        values:
                          - < worker1 hostname >
                          - < worker2 hostname >
                          - < worker3 hostname >
            podAntiAffinity:
              requiredDuringSchedulingIgnoredDuringExecution:
                - labelSelector:
                    matchExpressions:
                      - key: app
                        operator: In
                        values:
                          - deploy-nfs
                  topologyKey: "kubernetes.io/hostname"
          volumes:
            - name: pv-nfs
              persistentVolumeClaim:
                claimName: pvc-nfs

    # create the pods using the deployment.yml file
    oc create -f deployment.yml
    # check for pods status
    oc get pods -n < project name >

7. NFS Validation

  1. At the NFS-server end, create a sample file and check that it appears on the NFS client and inside the pod as well.
  2. At the NFS-client end, create a sample file and check that it appears on the NFS server and inside the pod as well.
  3. At the pod level, create a file in the container mount path, i.e. /mydata; it will be reflected on the NFS-server and NFS-client machines.
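The three checks can be scripted as a dry run. The commands below are printed rather than executed; the server hostname, worker hostname, SSH users, pod reference, and project name are placeholders for your environment:

```shell
# Print one NFS validation command per vantage point (dry run).
# NFS_SERVER, WORKER1, and PROJECT_NAME are placeholders.
nfs_validation_cmds() {
    echo "ssh ec2-user@NFS_SERVER 'touch /var/nfsshare/from-server'"
    echo "ssh core@WORKER1 'ls /mnt/nfs/var/nfsshare/from-server'"
    echo "oc exec deploy/deploy-nfs -n PROJECT_NAME -- ls /mydata/from-server"
}
nfs_validation_cmds
```

If all three commands succeed on the real cluster, the same file is visible from the server, the client mount, and the pod, confirming the share is wired end to end.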

Takeaways

We can now make our data persistent, accessible, and retainable through NFS by means of static provisioning, which requires manual creation of persistent volumes.

What's ahead?

Can we claim a persistent volume without actually having to create it?

Yes. Without actually creating a persistent volume, we can allocate a storage volume just by creating a persistent volume claim; this reduces the time it takes to bring up the application.
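As a preview, with a dynamic provisioner installed (the subject of the next post), the claim alone is enough. Here is a sketch, assuming the provisioner is configured to serve the `nfs-storage` class; no PV and no `volumeName` are needed:

```yaml
## pvc-dynamic.yml -- the provisioner creates the backing volume
## automatically when this claim is bound
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: pvc-nfs-dynamic
spec:
  accessModes:
    - ReadWriteMany
  resources:
    requests:
      storage: 20Mi
  storageClassName: nfs-storage ## class served by the dynamic provisioner
```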

In the next post we will see how dynamically provisioned NFS storage is set up.

SmartEDI: Challenges Small Companies Face to Overcome the Supply Chain Hurdles

Small companies operate a tight ship due to various limitations. Their budget, workforce, cash flows, and client base are some of the constraints to consider. Also, they struggle with taxation policies, government regulations, and compliance rules. Moreover, their profits and productivity depend on the quality of the supply chain.

Unfortunately, small company managers encounter many problems here. Procurements get plagued by miscommunication and unnecessary delays. Not every trading partner gives pricing breaks to suit small companies, and any small inefficiency in product delivery and shipping models will cause significant backlogs and losses.

Thankfully, computer and internet technologies are all prevalent in the world. By investing in automation and digitization, some hurdles can become smaller. The supply chain workflows can become streamlined and electronic data interchange (EDI) can create efficiency. But the problems also become more diverse and challenging for the supply chain managers.

Main obstacles

Manual processes

Data entry and retrieval using manual methods are inefficient and slow. It is challenging to generate error-free reports, invoices, and purchase orders. Also, the paperwork has to get organized, managed, and archived using tedious procedures. The manual labor and complexity required to do all this work are tremendous, and much of it is redundant.

In contrast, computerization and mobile access are accurate and fast. The manager can retrieve pertinent data in real-time. By adopting new technology and automation (RPA), data storage, processing, and management also become simplified. And cloud connectivity, mobile data collection, and voice-controlled solutions offer superior strategies.

IT security

Supply chain processes require regular network access for data. The integrity, accuracy, safety, and security of data come into question. The best supply chain solution with security leaks is a significant liability and a costly lawsuit. Hackers target weaker networks and ransom a heavy price after stealing essential data.

Automated solutions with due diligence for preventive security is essential. Cybersecurity has to be a priority, and encrypted methods are becoming basic standards. The best security software assures the manager’s peace of mind and productive efficiency. 

It is not just about the security of your infrastructure. The data security of every partner in your supply chain is essential. Data leakage can occur at any point in the supply chain if you do not have trade partners who take data security as seriously as you do. It is better to break off a relationship with such weak trade partners. If their IT infrastructure is not fool-proof, then you, too, will become vulnerable. The other challenge here is the budget to incorporate those changes.

Third-party logistics

Some companies have the infrastructure, resources, and expertise for storing and shipping products. Partnering with them is not as straightforward as it looks. Third-party logistics can quickly turn into a nightmare even for proficient managers. But the advantages are too many to ignore, so many businesses go ahead.

You can save storage space, reduce costs, lower risk, and save time. You can also improve the productivity of supply chain workflows. But do not rush into signing a contract with any 3rd party partner. Look into their credentials and proceed after doing a thorough cost-benefit analysis.

There are too many third-party logistics services providers in the market. Yes, they manage the warehousing, inventory, and fulfillment processes. For small companies, this may seem like a great relief. But such outsourcing brings challenges related to contracts, tools, data security, management, and automation.

Cost efficiency

Supply chain operating costs grow with new technologies. They also rise because of fuel, energy, and HR expenses. But cutting down costs is not an easy hurdle to cross for the management. Small companies have to introduce innovation without creating workforce resentment.

They can rely on mobile data collection tools to lower inventory mistakes. It will help improve the company’s sales in the long run. Tools like product trackers, barcode scanners, and digital forms are vital for controlling costs. Cost efficiency also is possible with cutting unwanted resources, automation, outsourcing, and so on.

HR and teamwork

Reducing HR costs is always high on the agenda of a small company. But supply chain processes are not only crucial but also complicated. Besides, the data-intensive workflows have to be accurate, prompt, and reliable. Such an activity requires teamwork and efficient duty delegation.

Supply chain managers can get vexed by the dynamic environment. Real-time events are unpredictable, and experts are essential. Scalability becomes a nightmare in many cases. Every company aspires to change its capacity to adapt to the seasonal change in demand. But finding experienced candidates and recruiting them can be an expensive affair. Besides, they have to be on-boarded and acquainted with the office culture.

Building a reliable team takes time, and managers have to be patient. They should factor in business needs, inter-departmental conflicts, training costs, etc. Ultimately, hiring the right person for the right job is essential to ensure productivity and profits. 

Conclusion

To sum up, small businesses have to do a fine, balancing act. They have to manage the supply chain for qualitative and cost improvements. At the same time, they have to deliver customer satisfaction and lower the risk of doing business in a competitive environment.

Otherwise, a company loses not only customers but also trading partners. With technological infusion, efficiency improves a great deal. But the threats are equally alive, and overcoming these hurdles is paramount.

SmartEDI can be your partner to easily onboard customers and kick-start EDI transactions in a smarter way. Our solution is easy to install and runs efficiently regardless of infrastructure challenges, while being highly portable, flexible, and scalable. Customers see an assured 80% to 90% improvement in onboarding new partners, and the time and money invested drop drastically, to 10% to 20% of what existing solutions require. Check out the link to know more.

EDI Everywhere

How Fortune 500 corporations widely use EDI

With footprints tracing back decades, Electronic Data Interchange (EDI) is one of the swiftest and simplest modes of data transfer. With the ease of communicating online and sharing details with various stakeholders in real time, EDI has proven to be a tool that benefits businesses of all sizes.

With years of successful implementation in large Fortune 500 organizations with measurable returns on investment, EDI has a proven track record of positive impact even when the revenues hit the low trail.

Efficient data integration, increased turnover, and reduced cost: this paradigm shift in business profits due to EDI implementation has been a prime point of success. With dramatic cost savings and benefits linked to streamlined operations, EDI is extensively applied by Fortune 500 companies.

Benefits and Use of EDI

The benefits that the EDI offers are unmatched with any of the other business implementation strategies. Developed to solve a multitude of problems, the EDI is now widely used by various organizations. By facilitating automation and smooth functioning, EDI has multiple benefits to offer to the companies.

The use of EDI is also related to environmentally sustainable practices in several ways. Reduced chances of error and quick processing of requests have ensured retailer compliance and better product distribution. By shortening process throughput times and reducing potential errors, EDI strongly supports Fortune 500 companies.

The benefits that these companies have extracted from the application of EDI are:

Quick Transaction processing

With reduced paperwork and digitization, the speed at which transactions are processed and communication is established is greatly enhanced. With quick and easy retrieval of files, faster turnaround times, and higher customer satisfaction, EDI is widely used by Fortune 500 companies.

Reduced errors

Manual data entry is slow, with a high chance of clerical error. With automated data entry and data processing handled digitally, the possibility of human error decreases.

Attaining Accessibility

With real-time processing and round-the-clock access to data, tracking is simplified considerably. With such an implementation of EDI, a company gains the benefit of staying ahead of the competition. With streamlined servicing and standardized processes, businesses can manage accounts and inventory more accurately.

Enhanced Efficiency

The paradigm shift seen in the wide usage of EDI by Fortune 500 companies is in the area of productivity. With increased efficiency, a business can directly improve customer relationships and add value to the company.

High Security

Businesses are at their best when they can assure the high security of data stored and transmitted. With EDI integration, data is secured end-to-end without any leakage. EDI also makes it easier to track and store data for auditing purposes, helping meet industry compliance standards.

Companies using EDI

After decades of implementation, an ample number of industries already use EDI in their day-to-day operations, and businesses benefit from its many advantages. The top sectors among the Fortune 500 companies that have widely implemented EDI are:

Pharmaceutical Industry

Relying on an extreme level of precision, the pharmaceutical industry needs the highest levels of data security and accuracy. With EDI integration, the channel of communication is considerably streamlined, and the electronic process has significantly reduced errors and turnaround time.

Retail Industry

With the customer at the centre of the retail business, reducing overhead costs by cutting errors, turnaround times, and labour costs is a must to ensure the best pricing. With the implementation of EDI, companies in this segment can assure the best services at a competitive cost.

Automotive Industry

The weekly analysis of stock required to avoid shortages in the manufacturing process, improved communication with material suppliers, and the elimination of paper invoices are the key areas of improvement that EDI has addressed.

Financial Industry

With a wide range of customer-driven financial transactions and operations, and the need for accurate data, EDI integration in financial companies works wonders. With the swift communication that takes place between banks and customers, or between banks themselves, EDI ensures no data leakage and high security for these companies.

Tech Industry

Allowing quick and hassle-free communication is key to the success of various tech companies. EDI is a prime factor the tech industry has favoured for handling numerous business processes, from design to manufacturing and supply management.

Conclusion

With global connectivity and extensive implementation, Fortune 500 companies have benefited greatly from EDI. Electronic data interchange is a hassle-free and quick way to stay connected in the business world.

A tool seen as a game-changer for over a decade, EDI has proved effective, efficient, and smart across various aspects and sectors. Working with an industry specialist, EDI analysis can help identify potential improvements and turn them into significant gains. Thus, implementing EDI is one of the core competencies of Fortune 500 companies and of many small enterprises in today's scenario.

CloudGen © 2019 All Rights Reserved